00:00:00.001 Started by upstream project "spdk-dpdk-per-patch" build number 235
00:00:00.001 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.014 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.015 The recommended git tool is: git
00:00:00.015 using credential 00000000-0000-0000-0000-000000000002
00:00:00.016 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.029 Fetching changes from the remote Git repository
00:00:00.031 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.045 Using shallow fetch with depth 1
00:00:00.045 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.045 > git --version # timeout=10
00:00:00.065 > git --version # 'git version 2.39.2'
00:00:00.065 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.066 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.066 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.188 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.201 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.213 Checking out Revision f620ee97e10840540f53609861ee9b86caa3c192 (FETCH_HEAD)
00:00:02.213 > git config core.sparsecheckout # timeout=10
00:00:02.225 > git read-tree -mu HEAD # timeout=10
00:00:02.241 > git checkout -f f620ee97e10840540f53609861ee9b86caa3c192 # timeout=5
00:00:02.261 Commit message: "change IP of vertiv1 PDU"
00:00:02.261 > git rev-list --no-walk f620ee97e10840540f53609861ee9b86caa3c192 # timeout=10
00:00:02.338 [Pipeline] Start of Pipeline
00:00:02.353 [Pipeline] library
00:00:02.354 Loading library shm_lib@master
00:00:02.355 Library shm_lib@master is cached. Copying from home.
00:00:02.370 [Pipeline] node
00:00:02.378 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest
00:00:02.381 [Pipeline] {
00:00:02.395 [Pipeline] catchError
00:00:02.397 [Pipeline] {
00:00:02.413 [Pipeline] wrap
00:00:02.426 [Pipeline] {
00:00:02.433 [Pipeline] stage
00:00:02.434 [Pipeline] { (Prologue)
00:00:02.640 [Pipeline] sh
00:00:02.923 + logger -p user.info -t JENKINS-CI
00:00:02.936 [Pipeline] echo
00:00:02.936 Node: WFP50
00:00:02.944 [Pipeline] sh
00:00:03.238 [Pipeline] setCustomBuildProperty
00:00:03.248 [Pipeline] echo
00:00:03.250 Cleanup processes
00:00:03.254 [Pipeline] sh
00:00:03.538 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.538 1508488 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.549 [Pipeline] sh
00:00:03.827 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.827 ++ grep -v 'sudo pgrep'
00:00:03.827 ++ awk '{print $1}'
00:00:03.827 + sudo kill -9
00:00:03.827 + true
00:00:03.839 [Pipeline] cleanWs
00:00:03.848 [WS-CLEANUP] Deleting project workspace...
00:00:03.848 [WS-CLEANUP] Deferred wipeout is used...
00:00:03.854 [WS-CLEANUP] done
00:00:03.857 [Pipeline] setCustomBuildProperty
00:00:03.869 [Pipeline] sh
00:00:04.147 + sudo git config --global --replace-all safe.directory '*'
00:00:04.199 [Pipeline] nodesByLabel
00:00:04.201 Found a total of 1 nodes with the 'sorcerer' label
00:00:04.208 [Pipeline] httpRequest
00:00:04.211 HttpMethod: GET
00:00:04.212 URL: http://10.211.164.101/packages/jbp_f620ee97e10840540f53609861ee9b86caa3c192.tar.gz
00:00:04.214 Sending request to url: http://10.211.164.101/packages/jbp_f620ee97e10840540f53609861ee9b86caa3c192.tar.gz
00:00:04.217 Response Code: HTTP/1.1 200 OK
00:00:04.218 Success: Status code 200 is in the accepted range: 200,404
00:00:04.218 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_f620ee97e10840540f53609861ee9b86caa3c192.tar.gz
00:00:04.615 [Pipeline] sh
00:00:04.905 + tar --no-same-owner -xf jbp_f620ee97e10840540f53609861ee9b86caa3c192.tar.gz
00:00:04.923 [Pipeline] httpRequest
00:00:04.926 HttpMethod: GET
00:00:04.926 URL: http://10.211.164.101/packages/spdk_b68ae4fb9294e2067b21e3ded559f637585386b4.tar.gz
00:00:04.927 Sending request to url: http://10.211.164.101/packages/spdk_b68ae4fb9294e2067b21e3ded559f637585386b4.tar.gz
00:00:04.930 Response Code: HTTP/1.1 200 OK
00:00:04.931 Success: Status code 200 is in the accepted range: 200,404
00:00:04.931 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_b68ae4fb9294e2067b21e3ded559f637585386b4.tar.gz
00:00:17.386 [Pipeline] sh
00:00:17.666 + tar --no-same-owner -xf spdk_b68ae4fb9294e2067b21e3ded559f637585386b4.tar.gz
00:00:20.211 [Pipeline] sh
00:00:20.492 + git -C spdk log --oneline -n5
00:00:20.492 b68ae4fb9 nvmf-tcp: Added queue depth tracing support
00:00:20.492 46d7b94f0 nvmf-rdma: Added queue depth tracing support
00:00:20.492 0127345c8 nvme-tcp: Added queue depth tracing support
00:00:20.492 887390405 nvme-pcie: Added queue depth tracing support
00:00:20.492 2a75dcc9a lib/bdev: Added queue depth tracing support
00:00:20.506 [Pipeline] sh
00:00:20.790 + git -C spdk/dpdk fetch https://review.spdk.io/gerrit/spdk/dpdk refs/changes/88/22688/3
00:00:21.727 From https://review.spdk.io/gerrit/spdk/dpdk
00:00:21.727 * branch refs/changes/88/22688/3 -> FETCH_HEAD
00:00:21.739 [Pipeline] sh
00:00:22.055 + git -C spdk/dpdk checkout FETCH_HEAD
00:00:22.999 Previous HEAD position was db99adb13f kernel/freebsd: fix module build on FreeBSD 14
00:00:22.999 HEAD is now at 04f9dc6803 meson/mlx5: Suppress -Wunused-value diagnostic
00:00:23.012 [Pipeline] }
00:00:23.035 [Pipeline] // stage
00:00:23.050 [Pipeline] stage
00:00:23.056 [Pipeline] { (Prepare)
00:00:23.092 [Pipeline] writeFile
00:00:23.103 [Pipeline] sh
00:00:23.378 + logger -p user.info -t JENKINS-CI
00:00:23.390 [Pipeline] sh
00:00:23.670 + logger -p user.info -t JENKINS-CI
00:00:23.683 [Pipeline] sh
00:00:23.963 + cat autorun-spdk.conf
00:00:23.963 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:23.963 SPDK_TEST_BLOCKDEV=1
00:00:23.964 SPDK_TEST_ISAL=1
00:00:23.964 SPDK_TEST_CRYPTO=1
00:00:23.964 SPDK_TEST_REDUCE=1
00:00:23.964 SPDK_TEST_VBDEV_COMPRESS=1
00:00:23.964 SPDK_RUN_UBSAN=1
00:00:23.970 RUN_NIGHTLY=
00:00:23.975 [Pipeline] readFile
00:00:23.998 [Pipeline] withEnv
00:00:24.000 [Pipeline] {
00:00:24.012 [Pipeline] sh
00:00:24.294 + set -ex
00:00:24.294 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:00:24.294 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:24.294 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:24.294 ++ SPDK_TEST_BLOCKDEV=1
00:00:24.294 ++ SPDK_TEST_ISAL=1
00:00:24.294 ++ SPDK_TEST_CRYPTO=1
00:00:24.294 ++ SPDK_TEST_REDUCE=1
00:00:24.294 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:24.294 ++ SPDK_RUN_UBSAN=1
00:00:24.294 ++ RUN_NIGHTLY=
00:00:24.294 + case $SPDK_TEST_NVMF_NICS in
00:00:24.294 + DRIVERS=
00:00:24.294 + [[ -n '' ]]
00:00:24.294 + exit 0
00:00:24.303 [Pipeline] }
00:00:24.320 [Pipeline] // withEnv
00:00:24.324 [Pipeline] }
00:00:24.341 [Pipeline] // stage
00:00:24.350 [Pipeline] catchError
00:00:24.351 [Pipeline] {
00:00:24.364 [Pipeline] timeout
00:00:24.364 Timeout set to expire in 40 min
00:00:24.365 [Pipeline] {
00:00:24.377 [Pipeline] stage
00:00:24.379 [Pipeline] { (Tests)
00:00:24.392 [Pipeline] sh
00:00:24.674 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:00:24.674 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:00:24.674 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:00:24.674 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:00:24.674 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:24.674 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:00:24.674 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:00:24.674 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:24.674 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:00:24.674 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:24.674 + cd /var/jenkins/workspace/crypto-phy-autotest
00:00:24.674 + source /etc/os-release
00:00:24.674 ++ NAME='Fedora Linux'
00:00:24.674 ++ VERSION='38 (Cloud Edition)'
00:00:24.674 ++ ID=fedora
00:00:24.674 ++ VERSION_ID=38
00:00:24.674 ++ VERSION_CODENAME=
00:00:24.674 ++ PLATFORM_ID=platform:f38
00:00:24.674 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:24.674 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:24.674 ++ LOGO=fedora-logo-icon
00:00:24.674 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:24.674 ++ HOME_URL=https://fedoraproject.org/
00:00:24.674 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:24.674 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:24.674 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:24.674 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:24.674 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:24.674 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:24.674 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:24.674 ++ SUPPORT_END=2024-05-14
00:00:24.674 ++ VARIANT='Cloud Edition'
00:00:24.674 ++ VARIANT_ID=cloud
00:00:24.674 + uname -a
00:00:24.674 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:00:24.674 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:00:27.962 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:00:27.962 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:00:27.962 Hugepages
00:00:27.962 node hugesize free / total
00:00:27.962 node0 1048576kB 0 / 0
00:00:27.962 node0 2048kB 0 / 0
00:00:27.962 node1 1048576kB 0 / 0
00:00:27.962 node1 2048kB 0 / 0
00:00:27.962
00:00:27.962 Type BDF Vendor Device NUMA Driver Device Block devices
00:00:27.962 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:00:27.962 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:00:27.962 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:00:27.962 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:00:27.962 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:00:27.962 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:00:27.962 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:00:27.962 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:00:27.962 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
00:00:27.962 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:00:27.962 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:00:27.962 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:00:27.962 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:00:27.962 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:00:27.962 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:00:27.962 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:00:28.221 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:00:28.221 VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
00:00:28.221 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - -
00:00:28.221 + rm -f /tmp/spdk-ld-path
00:00:28.221 + source autorun-spdk.conf
00:00:28.221 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:28.221 ++ SPDK_TEST_BLOCKDEV=1
00:00:28.221 ++ SPDK_TEST_ISAL=1
00:00:28.221 ++ SPDK_TEST_CRYPTO=1
00:00:28.221 ++ SPDK_TEST_REDUCE=1
00:00:28.221 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:28.221 ++ SPDK_RUN_UBSAN=1
00:00:28.221 ++ RUN_NIGHTLY=
00:00:28.221 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:00:28.221 + [[ -n '' ]]
00:00:28.221 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:28.221 + for M in /var/spdk/build-*-manifest.txt
00:00:28.221 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:00:28.221 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:28.221 + for M in /var/spdk/build-*-manifest.txt
00:00:28.221 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:00:28.221 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:28.221 ++ uname
00:00:28.221 + [[ Linux == \L\i\n\u\x ]]
00:00:28.221 + sudo dmesg -T
00:00:28.221 + sudo dmesg --clear
00:00:28.221 + dmesg_pid=1509495
00:00:28.221 + [[ Fedora Linux == FreeBSD ]]
00:00:28.221 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:28.221 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:28.221 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:00:28.221 + [[ -x /usr/src/fio-static/fio ]]
00:00:28.221 + export FIO_BIN=/usr/src/fio-static/fio
00:00:28.221 + FIO_BIN=/usr/src/fio-static/fio
00:00:28.221 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:00:28.221 + [[ ! -v VFIO_QEMU_BIN ]]
00:00:28.221 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:00:28.221 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:28.221 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:28.221 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:00:28.221 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:28.221 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:28.221 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:28.221 + sudo dmesg -Tw
00:00:28.221 Test configuration:
00:00:28.221 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:28.221 SPDK_TEST_BLOCKDEV=1
00:00:28.221 SPDK_TEST_ISAL=1
00:00:28.221 SPDK_TEST_CRYPTO=1
00:00:28.221 SPDK_TEST_REDUCE=1
00:00:28.221 SPDK_TEST_VBDEV_COMPRESS=1
00:00:28.221 SPDK_RUN_UBSAN=1
00:00:28.533 RUN_NIGHTLY=
11:36:55 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:00:28.533 11:36:55 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:00:28.533 11:36:55 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:00:28.533 11:36:55 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:00:28.533 11:36:55 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:28.533 11:36:55 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:28.533 11:36:55 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:28.533 11:36:55 -- paths/export.sh@5 -- $ export PATH
00:00:28.534 11:36:55 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:28.534 11:36:55 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:00:28.534 11:36:55 -- common/autobuild_common.sh@437 -- $ date +%s
00:00:28.534 11:36:55 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1715679415.XXXXXX
00:00:28.534 11:36:55 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1715679415.nnUX5q
00:00:28.534 11:36:55 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]]
00:00:28.534 11:36:55 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']'
00:00:28.534 11:36:55 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:00:28.534 11:36:55 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:00:28.534 11:36:55 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:00:28.534 11:36:55 -- common/autobuild_common.sh@453 -- $ get_config_params
00:00:28.534 11:36:55 -- common/autotest_common.sh@395 -- $ xtrace_disable
00:00:28.534 11:36:55 -- common/autotest_common.sh@10 -- $ set +x
00:00:28.534 11:36:55 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:00:28.534 11:36:55 -- common/autobuild_common.sh@455 -- $ start_monitor_resources
00:00:28.534 11:36:55 -- pm/common@17 -- $ local monitor
00:00:28.534 11:36:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:28.534 11:36:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:28.534 11:36:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:28.534 11:36:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:28.534 11:36:55 -- pm/common@21 -- $ date +%s
00:00:28.534 11:36:55 -- pm/common@25 -- $ sleep 1
00:00:28.534 11:36:55 -- pm/common@21 -- $ date +%s
00:00:28.534 11:36:55 -- pm/common@21 -- $ date +%s
00:00:28.534 11:36:55 -- pm/common@21 -- $ date +%s
00:00:28.534 11:36:55 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715679415
00:00:28.534 11:36:55 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715679415
00:00:28.534 11:36:55 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715679415
00:00:28.534 11:36:55 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715679415
00:00:28.534 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715679415_collect-vmstat.pm.log
00:00:28.534 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715679415_collect-cpu-load.pm.log
00:00:28.534 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715679415_collect-cpu-temp.pm.log
00:00:28.534 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715679415_collect-bmc-pm.bmc.pm.log
00:00:29.472 11:36:56 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT
00:00:29.472 11:36:56 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:00:29.472 11:36:56 -- spdk/autobuild.sh@12 -- $ umask 022
00:00:29.472 11:36:56 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:29.472 11:36:56 -- spdk/autobuild.sh@16 -- $ date -u
00:00:29.472 Tue May 14 09:36:56 AM UTC 2024
00:00:29.472 11:36:56 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:00:29.472 v24.05-pre-610-gb68ae4fb9
00:00:29.472 11:36:56 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:00:29.472 11:36:56 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:00:29.472 11:36:56 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:00:29.472 11:36:56 -- common/autotest_common.sh@1097 -- $ '[' 3 -le 1 ']'
00:00:29.472 11:36:56 -- common/autotest_common.sh@1103 -- $ xtrace_disable
00:00:29.472 11:36:56 -- common/autotest_common.sh@10 -- $ set +x
00:00:29.472 ************************************
00:00:29.472 START TEST ubsan
00:00:29.472 ************************************
00:00:29.472 11:36:56 ubsan -- common/autotest_common.sh@1121 -- $ echo 'using ubsan'
00:00:29.472 using ubsan
00:00:29.472
00:00:29.472 real 0m0.000s
00:00:29.472 user 0m0.000s
00:00:29.472 sys 0m0.000s
00:00:29.472 11:36:56 ubsan -- common/autotest_common.sh@1122 -- $ xtrace_disable
00:00:29.472 11:36:56 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:00:29.472 ************************************
00:00:29.472 END TEST ubsan
00:00:29.472 ************************************
00:00:29.472 11:36:56 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:00:29.472 11:36:56 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:00:29.472 11:36:56 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:00:29.472 11:36:56 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:00:29.472 11:36:56 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:00:29.472 11:36:56 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:00:29.472 11:36:56 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:00:29.472 11:36:56 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:00:29.472 11:36:56 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:00:29.731 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:00:29.731 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:00:29.990 Using 'verbs' RDMA provider
00:00:46.284 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:01.173 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:01.173 Creating mk/config.mk...done.
00:01:01.173 Creating mk/cc.flags.mk...done.
00:01:01.173 Type 'make' to build.
00:01:01.173 11:37:27 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:01:01.173 11:37:27 -- common/autotest_common.sh@1097 -- $ '[' 3 -le 1 ']'
00:01:01.173 11:37:27 -- common/autotest_common.sh@1103 -- $ xtrace_disable
00:01:01.173 11:37:27 -- common/autotest_common.sh@10 -- $ set +x
00:01:01.173 ************************************
00:01:01.173 START TEST make
00:01:01.173 ************************************
00:01:01.173 11:37:27 make -- common/autotest_common.sh@1121 -- $ make -j72
00:01:01.173 make[1]: Nothing to be done for 'all'.
00:01:39.913 The Meson build system
00:01:39.913 Version: 1.3.1
00:01:39.913 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:01:39.913 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:01:39.913 Build type: native build
00:01:39.913 Program cat found: YES (/usr/bin/cat)
00:01:39.913 Project name: DPDK
00:01:39.913 Project version: 24.03.0
00:01:39.913 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:01:39.913 C linker for the host machine: cc ld.bfd 2.39-16
00:01:39.913 Host machine cpu family: x86_64
00:01:39.913 Host machine cpu: x86_64
00:01:39.913 Message: ## Building in Developer Mode ##
00:01:39.913 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:39.913 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:01:39.913 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:01:39.913 Program python3 found: YES (/usr/bin/python3)
00:01:39.913 Program cat found: YES (/usr/bin/cat)
00:01:39.913 Compiler for C supports arguments -march=native: YES
00:01:39.913 Checking for size of "void *" : 8
00:01:39.913 Checking for size of "void *" : 8 (cached)
00:01:39.913 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:01:39.913 Library m found: YES
00:01:39.913 Library numa found: YES
00:01:39.913 Has header "numaif.h" : YES
00:01:39.913 Library fdt found: NO
00:01:39.913 Library execinfo found: NO
00:01:39.913 Has header "execinfo.h" : YES
00:01:39.913 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:39.913 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:39.913 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:39.913 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:39.913 Run-time dependency openssl found: YES 3.0.9
00:01:39.913 Run-time dependency libpcap found: YES 1.10.4
00:01:39.913 Has header "pcap.h" with dependency libpcap: YES
00:01:39.913 Compiler for C supports arguments -Wcast-qual: YES
00:01:39.913 Compiler for C supports arguments -Wdeprecated: YES
00:01:39.913 Compiler for C supports arguments -Wformat: YES
00:01:39.913 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:39.913 Compiler for C supports arguments -Wformat-security: NO
00:01:39.913 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:39.913 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:39.913 Compiler for C supports arguments -Wnested-externs: YES
00:01:39.913 Compiler for C supports arguments -Wold-style-definition: YES
00:01:39.913 Compiler for C supports arguments -Wpointer-arith: YES
00:01:39.913 Compiler for C supports arguments -Wsign-compare: YES
00:01:39.913 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:39.913 Compiler for C supports arguments -Wundef: YES
00:01:39.913 Compiler for C supports arguments -Wwrite-strings: YES
00:01:39.913 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:39.913 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:39.913 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:39.913 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:39.913 Program objdump found: YES (/usr/bin/objdump)
00:01:39.913 Compiler for C supports arguments -mavx512f: YES
00:01:39.913 Checking if "AVX512 checking" compiles: YES
00:01:39.913 Fetching value of define "__SSE4_2__" : 1
00:01:39.913 Fetching value of define "__AES__" : 1
00:01:39.913 Fetching value of define "__AVX__" : 1
00:01:39.913 Fetching value of define "__AVX2__" : 1
00:01:39.913 Fetching value of define "__AVX512BW__" : 1
00:01:39.913 Fetching value of define "__AVX512CD__" : 1
00:01:39.913 Fetching value of define "__AVX512DQ__" : 1
00:01:39.913 Fetching value of define "__AVX512F__" : 1
00:01:39.913 Fetching value of define "__AVX512VL__" : 1
00:01:39.913 Fetching value of define "__PCLMUL__" : 1
00:01:39.913 Fetching value of define "__RDRND__" : 1
00:01:39.913 Fetching value of define "__RDSEED__" : 1
00:01:39.913 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:39.913 Fetching value of define "__znver1__" : (undefined)
00:01:39.913 Fetching value of define "__znver2__" : (undefined)
00:01:39.913 Fetching value of define "__znver3__" : (undefined)
00:01:39.913 Fetching value of define "__znver4__" : (undefined)
00:01:39.913 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:39.913 Message: lib/log: Defining dependency "log"
00:01:39.913 Message: lib/kvargs: Defining dependency "kvargs"
00:01:39.913 Message: lib/telemetry: Defining dependency "telemetry"
00:01:39.913 Checking for function "getentropy" : NO
00:01:39.913 Message: lib/eal: Defining dependency "eal"
00:01:39.913 Message: lib/ring: Defining dependency "ring"
00:01:39.913 Message: lib/rcu: Defining dependency "rcu"
00:01:39.913 Message: lib/mempool: Defining dependency "mempool"
00:01:39.914 Message: lib/mbuf: Defining dependency "mbuf"
00:01:39.914 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:39.914 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:39.914 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:39.914 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:39.914 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:39.914 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:39.914 Compiler for C supports arguments -mpclmul: YES
00:01:39.914 Compiler for C supports arguments -maes: YES
00:01:39.914 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:39.914 Compiler for C supports arguments -mavx512bw: YES
00:01:39.914 Compiler for C supports arguments -mavx512dq: YES
00:01:39.914 Compiler for C supports arguments -mavx512vl: YES
00:01:39.914 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:39.914 Compiler for C supports arguments -mavx2: YES
00:01:39.914 Compiler for C supports arguments -mavx: YES
00:01:39.914 Message: lib/net: Defining dependency "net"
00:01:39.914 Message: lib/meter: Defining dependency "meter"
00:01:39.914 Message: lib/ethdev: Defining dependency "ethdev"
00:01:39.914 Message: lib/pci: Defining dependency "pci"
00:01:39.914 Message: lib/cmdline: Defining dependency "cmdline"
00:01:39.914 Message: lib/hash: Defining dependency "hash"
00:01:39.914 Message: lib/timer: Defining dependency "timer"
00:01:39.914 Message: lib/compressdev: Defining dependency "compressdev"
00:01:39.914 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:39.914 Message: lib/dmadev: Defining dependency "dmadev"
00:01:39.914 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:39.914 Message: lib/power: Defining dependency "power"
00:01:39.914 Message: lib/reorder: Defining dependency "reorder"
00:01:39.914 Message: lib/security: Defining dependency "security"
00:01:39.914 lib/meson.build:163: WARNING: Cannot disable mandatory library "stack"
00:01:39.914 Message: lib/stack: Defining dependency "stack"
00:01:39.914 Has header "linux/userfaultfd.h" : YES
00:01:39.914 Has header "linux/vduse.h" : YES
00:01:39.914 Message: lib/vhost: Defining dependency "vhost"
00:01:39.914 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:39.914 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:01:39.914 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:39.914 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:39.914 Compiler for C supports arguments -std=c11: YES
00:01:39.914 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:01:39.914 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:01:39.914 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:01:39.914 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:01:39.914 Run-time dependency libmlx5 found: YES 1.24.44.0
00:01:39.914 Run-time dependency libibverbs found: YES 1.14.44.0
00:01:39.914 Library mtcr_ul found: NO
00:01:39.914 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:01:39.914 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:01:39.914 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:01:39.914 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:01:43.266 Header "infiniband/mlx5dv.h"
has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:01:43.266 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:01:43.266 
Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies 
libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:01:43.266 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:01:43.266 Configuring mlx5_autoconf.h using configuration 00:01:43.266 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:01:43.266 Run-time dependency libcrypto found: YES 3.0.9 00:01:43.266 Library IPSec_MB found: YES 00:01:43.266 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:01:43.266 Message: drivers/common/qat: Defining dependency "common_qat" 00:01:43.266 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:43.266 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:43.266 Library IPSec_MB found: YES 00:01:43.266 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:01:43.266 Message: 
drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:01:43.266 Compiler for C supports arguments -std=c11: YES (cached) 00:01:43.266 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:43.266 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:43.266 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:43.266 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:43.266 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:01:43.266 Run-time dependency libisal found: NO (tried pkgconfig) 00:01:43.266 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:01:43.266 Compiler for C supports arguments -std=c11: YES (cached) 00:01:43.266 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:43.266 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:43.266 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:43.266 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:43.266 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:01:43.266 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:43.266 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:43.266 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:43.266 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:43.266 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:43.266 Program doxygen found: YES (/usr/bin/doxygen) 00:01:43.266 Configuring doxy-api-html.conf using configuration 00:01:43.266 Configuring doxy-api-man.conf using configuration 00:01:43.266 Program mandb found: YES (/usr/bin/mandb) 00:01:43.266 Program sphinx-build found: NO 00:01:43.266 Configuring rte_build_config.h using configuration 00:01:43.266 Message: 00:01:43.266 ================= 00:01:43.266 
Applications Enabled 00:01:43.266 ================= 00:01:43.266 00:01:43.266 apps: 00:01:43.266 00:01:43.266 00:01:43.266 Message: 00:01:43.266 ================= 00:01:43.266 Libraries Enabled 00:01:43.266 ================= 00:01:43.266 00:01:43.266 libs: 00:01:43.266 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:43.266 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:43.266 cryptodev, dmadev, power, reorder, security, stack, vhost, 00:01:43.266 00:01:43.266 Message: 00:01:43.266 =============== 00:01:43.266 Drivers Enabled 00:01:43.266 =============== 00:01:43.266 00:01:43.266 common: 00:01:43.266 mlx5, qat, 00:01:43.266 bus: 00:01:43.266 auxiliary, pci, vdev, 00:01:43.266 mempool: 00:01:43.266 ring, 00:01:43.266 dma: 00:01:43.266 00:01:43.266 net: 00:01:43.266 00:01:43.266 crypto: 00:01:43.266 ipsec_mb, mlx5, 00:01:43.266 compress: 00:01:43.267 isal, mlx5, 00:01:43.267 vdpa: 00:01:43.267 00:01:43.267 00:01:43.267 Message: 00:01:43.267 ================= 00:01:43.267 Content Skipped 00:01:43.267 ================= 00:01:43.267 00:01:43.267 apps: 00:01:43.267 dumpcap: explicitly disabled via build config 00:01:43.267 graph: explicitly disabled via build config 00:01:43.267 pdump: explicitly disabled via build config 00:01:43.267 proc-info: explicitly disabled via build config 00:01:43.267 test-acl: explicitly disabled via build config 00:01:43.267 test-bbdev: explicitly disabled via build config 00:01:43.267 test-cmdline: explicitly disabled via build config 00:01:43.267 test-compress-perf: explicitly disabled via build config 00:01:43.267 test-crypto-perf: explicitly disabled via build config 00:01:43.267 test-dma-perf: explicitly disabled via build config 00:01:43.267 test-eventdev: explicitly disabled via build config 00:01:43.267 test-fib: explicitly disabled via build config 00:01:43.267 test-flow-perf: explicitly disabled via build config 00:01:43.267 test-gpudev: explicitly disabled via build config 00:01:43.267 test-mldev: 
explicitly disabled via build config 00:01:43.267 test-pipeline: explicitly disabled via build config 00:01:43.267 test-pmd: explicitly disabled via build config 00:01:43.267 test-regex: explicitly disabled via build config 00:01:43.267 test-sad: explicitly disabled via build config 00:01:43.267 test-security-perf: explicitly disabled via build config 00:01:43.267 00:01:43.267 libs: 00:01:43.267 argparse: explicitly disabled via build config 00:01:43.267 metrics: explicitly disabled via build config 00:01:43.267 acl: explicitly disabled via build config 00:01:43.267 bbdev: explicitly disabled via build config 00:01:43.267 bitratestats: explicitly disabled via build config 00:01:43.267 bpf: explicitly disabled via build config 00:01:43.267 cfgfile: explicitly disabled via build config 00:01:43.267 distributor: explicitly disabled via build config 00:01:43.267 efd: explicitly disabled via build config 00:01:43.267 eventdev: explicitly disabled via build config 00:01:43.267 dispatcher: explicitly disabled via build config 00:01:43.267 gpudev: explicitly disabled via build config 00:01:43.267 gro: explicitly disabled via build config 00:01:43.267 gso: explicitly disabled via build config 00:01:43.267 ip_frag: explicitly disabled via build config 00:01:43.267 jobstats: explicitly disabled via build config 00:01:43.267 latencystats: explicitly disabled via build config 00:01:43.267 lpm: explicitly disabled via build config 00:01:43.267 member: explicitly disabled via build config 00:01:43.267 pcapng: explicitly disabled via build config 00:01:43.267 rawdev: explicitly disabled via build config 00:01:43.267 regexdev: explicitly disabled via build config 00:01:43.267 mldev: explicitly disabled via build config 00:01:43.267 rib: explicitly disabled via build config 00:01:43.267 sched: explicitly disabled via build config 00:01:43.267 ipsec: explicitly disabled via build config 00:01:43.267 pdcp: explicitly disabled via build config 00:01:43.267 fib: explicitly disabled via 
build config 00:01:43.267 port: explicitly disabled via build config 00:01:43.267 pdump: explicitly disabled via build config 00:01:43.267 table: explicitly disabled via build config 00:01:43.267 pipeline: explicitly disabled via build config 00:01:43.267 graph: explicitly disabled via build config 00:01:43.267 node: explicitly disabled via build config 00:01:43.267 00:01:43.267 drivers: 00:01:43.267 common/cpt: not in enabled drivers build config 00:01:43.267 common/dpaax: not in enabled drivers build config 00:01:43.267 common/iavf: not in enabled drivers build config 00:01:43.267 common/idpf: not in enabled drivers build config 00:01:43.267 common/ionic: not in enabled drivers build config 00:01:43.267 common/mvep: not in enabled drivers build config 00:01:43.267 common/octeontx: not in enabled drivers build config 00:01:43.267 bus/cdx: not in enabled drivers build config 00:01:43.267 bus/dpaa: not in enabled drivers build config 00:01:43.267 bus/fslmc: not in enabled drivers build config 00:01:43.267 bus/ifpga: not in enabled drivers build config 00:01:43.267 bus/platform: not in enabled drivers build config 00:01:43.267 bus/uacce: not in enabled drivers build config 00:01:43.267 bus/vmbus: not in enabled drivers build config 00:01:43.267 common/cnxk: not in enabled drivers build config 00:01:43.267 common/nfp: not in enabled drivers build config 00:01:43.267 common/nitrox: not in enabled drivers build config 00:01:43.267 common/sfc_efx: not in enabled drivers build config 00:01:43.267 mempool/bucket: not in enabled drivers build config 00:01:43.267 mempool/cnxk: not in enabled drivers build config 00:01:43.267 mempool/dpaa: not in enabled drivers build config 00:01:43.267 mempool/dpaa2: not in enabled drivers build config 00:01:43.267 mempool/octeontx: not in enabled drivers build config 00:01:43.267 mempool/stack: not in enabled drivers build config 00:01:43.267 dma/cnxk: not in enabled drivers build config 00:01:43.267 dma/dpaa: not in enabled drivers build 
config 00:01:43.267 dma/dpaa2: not in enabled drivers build config 00:01:43.267 dma/hisilicon: not in enabled drivers build config 00:01:43.267 dma/idxd: not in enabled drivers build config 00:01:43.267 dma/ioat: not in enabled drivers build config 00:01:43.267 dma/skeleton: not in enabled drivers build config 00:01:43.267 net/af_packet: not in enabled drivers build config 00:01:43.267 net/af_xdp: not in enabled drivers build config 00:01:43.267 net/ark: not in enabled drivers build config 00:01:43.267 net/atlantic: not in enabled drivers build config 00:01:43.267 net/avp: not in enabled drivers build config 00:01:43.267 net/axgbe: not in enabled drivers build config 00:01:43.267 net/bnx2x: not in enabled drivers build config 00:01:43.267 net/bnxt: not in enabled drivers build config 00:01:43.267 net/bonding: not in enabled drivers build config 00:01:43.267 net/cnxk: not in enabled drivers build config 00:01:43.267 net/cpfl: not in enabled drivers build config 00:01:43.267 net/cxgbe: not in enabled drivers build config 00:01:43.267 net/dpaa: not in enabled drivers build config 00:01:43.267 net/dpaa2: not in enabled drivers build config 00:01:43.267 net/e1000: not in enabled drivers build config 00:01:43.267 net/ena: not in enabled drivers build config 00:01:43.267 net/enetc: not in enabled drivers build config 00:01:43.267 net/enetfec: not in enabled drivers build config 00:01:43.267 net/enic: not in enabled drivers build config 00:01:43.267 net/failsafe: not in enabled drivers build config 00:01:43.267 net/fm10k: not in enabled drivers build config 00:01:43.267 net/gve: not in enabled drivers build config 00:01:43.267 net/hinic: not in enabled drivers build config 00:01:43.267 net/hns3: not in enabled drivers build config 00:01:43.267 net/i40e: not in enabled drivers build config 00:01:43.267 net/iavf: not in enabled drivers build config 00:01:43.267 net/ice: not in enabled drivers build config 00:01:43.267 net/idpf: not in enabled drivers build config 
00:01:43.267 net/igc: not in enabled drivers build config 00:01:43.267 net/ionic: not in enabled drivers build config 00:01:43.267 net/ipn3ke: not in enabled drivers build config 00:01:43.267 net/ixgbe: not in enabled drivers build config 00:01:43.267 net/mana: not in enabled drivers build config 00:01:43.267 net/memif: not in enabled drivers build config 00:01:43.267 net/mlx4: not in enabled drivers build config 00:01:43.267 net/mlx5: not in enabled drivers build config 00:01:43.267 net/mvneta: not in enabled drivers build config 00:01:43.267 net/mvpp2: not in enabled drivers build config 00:01:43.267 net/netvsc: not in enabled drivers build config 00:01:43.267 net/nfb: not in enabled drivers build config 00:01:43.267 net/nfp: not in enabled drivers build config 00:01:43.267 net/ngbe: not in enabled drivers build config 00:01:43.267 net/null: not in enabled drivers build config 00:01:43.267 net/octeontx: not in enabled drivers build config 00:01:43.267 net/octeon_ep: not in enabled drivers build config 00:01:43.267 net/pcap: not in enabled drivers build config 00:01:43.267 net/pfe: not in enabled drivers build config 00:01:43.267 net/qede: not in enabled drivers build config 00:01:43.267 net/ring: not in enabled drivers build config 00:01:43.267 net/sfc: not in enabled drivers build config 00:01:43.267 net/softnic: not in enabled drivers build config 00:01:43.267 net/tap: not in enabled drivers build config 00:01:43.267 net/thunderx: not in enabled drivers build config 00:01:43.267 net/txgbe: not in enabled drivers build config 00:01:43.267 net/vdev_netvsc: not in enabled drivers build config 00:01:43.267 net/vhost: not in enabled drivers build config 00:01:43.267 net/virtio: not in enabled drivers build config 00:01:43.267 net/vmxnet3: not in enabled drivers build config 00:01:43.267 raw/*: missing internal dependency, "rawdev" 00:01:43.267 crypto/armv8: not in enabled drivers build config 00:01:43.267 crypto/bcmfs: not in enabled drivers build config 
00:01:43.267 crypto/caam_jr: not in enabled drivers build config 00:01:43.267 crypto/ccp: not in enabled drivers build config 00:01:43.267 crypto/cnxk: not in enabled drivers build config 00:01:43.267 crypto/dpaa_sec: not in enabled drivers build config 00:01:43.267 crypto/dpaa2_sec: not in enabled drivers build config 00:01:43.267 crypto/mvsam: not in enabled drivers build config 00:01:43.267 crypto/nitrox: not in enabled drivers build config 00:01:43.267 crypto/null: not in enabled drivers build config 00:01:43.267 crypto/octeontx: not in enabled drivers build config 00:01:43.267 crypto/openssl: not in enabled drivers build config 00:01:43.267 crypto/scheduler: not in enabled drivers build config 00:01:43.267 crypto/uadk: not in enabled drivers build config 00:01:43.267 crypto/virtio: not in enabled drivers build config 00:01:43.267 compress/nitrox: not in enabled drivers build config 00:01:43.267 compress/octeontx: not in enabled drivers build config 00:01:43.267 compress/zlib: not in enabled drivers build config 00:01:43.267 regex/*: missing internal dependency, "regexdev" 00:01:43.267 ml/*: missing internal dependency, "mldev" 00:01:43.267 vdpa/ifc: not in enabled drivers build config 00:01:43.267 vdpa/mlx5: not in enabled drivers build config 00:01:43.267 vdpa/nfp: not in enabled drivers build config 00:01:43.267 vdpa/sfc: not in enabled drivers build config 00:01:43.267 event/*: missing internal dependency, "eventdev" 00:01:43.267 baseband/*: missing internal dependency, "bbdev" 00:01:43.267 gpu/*: missing internal dependency, "gpudev" 00:01:43.267 00:01:43.267 00:01:43.836 Build targets in project: 118 00:01:43.836 00:01:43.836 DPDK 24.03.0 00:01:43.836 00:01:43.836 User defined options 00:01:43.836 buildtype : debug 00:01:43.836 default_library : shared 00:01:43.836 libdir : lib 00:01:43.836 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:43.836 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:01:43.836 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:01:43.836 cpu_instruction_set: native 00:01:43.836 disable_apps : test-sad,graph,test-regex,dumpcap,test-eventdev,test-compress-perf,pdump,test-security-perf,test-pmd,test-flow-perf,test-pipeline,test-crypto-perf,test-gpudev,test-cmdline,test-dma-perf,proc-info,test-bbdev,test-acl,test,test-mldev,test-fib 00:01:43.836 disable_libs : sched,port,dispatcher,graph,rawdev,pdcp,bitratestats,ipsec,pcapng,pdump,gso,cfgfile,gpudev,ip_frag,node,distributor,mldev,lpm,acl,bpf,latencystats,eventdev,regexdev,gro,stack,fib,pipeline,bbdev,table,metrics,member,jobstats,efd,rib,argparse 00:01:43.836 enable_docs : false 00:01:43.836 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:01:43.836 enable_kmods : false 00:01:43.836 tests : false 00:01:43.836 00:01:43.836 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:44.096 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:01:44.365 [1/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:44.365 [2/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:44.365 [3/384] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:44.365 [4/384] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:44.365 [5/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:44.365 [6/384] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 
00:01:44.365 [7/384] Linking static target lib/librte_kvargs.a 00:01:44.365 [8/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:44.365 [9/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:44.365 [10/384] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:44.365 [11/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:44.365 [12/384] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:44.365 [13/384] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:44.365 [14/384] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:44.365 [15/384] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:44.365 [16/384] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:44.365 [17/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:44.365 [18/384] Linking static target lib/librte_log.a 00:01:44.365 [19/384] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:44.965 [20/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:44.965 [21/384] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:44.965 [22/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:44.965 [23/384] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.965 [24/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:44.965 [25/384] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:44.965 [26/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:44.965 [27/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:44.965 [28/384] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:44.965 [29/384] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:44.965 [30/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:44.965 [31/384] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:44.965 [32/384] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:44.965 [33/384] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:44.965 [34/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:44.965 [35/384] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:44.965 [36/384] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:44.965 [37/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:44.965 [38/384] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:44.965 [39/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:44.965 [40/384] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:44.965 [41/384] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:44.965 [42/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:44.965 [43/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:44.965 [44/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:44.965 [45/384] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:44.965 [46/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:44.965 [47/384] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:44.965 [48/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:44.965 [49/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:44.965 [50/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:44.965 [51/384] Compiling C object 
lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:44.965 [52/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:44.965 [53/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:44.965 [54/384] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:44.965 [55/384] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:44.965 [56/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:44.965 [57/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:44.965 [58/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:44.965 [59/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:44.965 [60/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:44.965 [61/384] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:44.965 [62/384] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:44.965 [63/384] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:44.965 [64/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:44.965 [65/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:44.965 [66/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:44.965 [67/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:44.965 [68/384] Linking static target lib/librte_telemetry.a 00:01:44.965 [69/384] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:44.965 [70/384] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:44.965 [71/384] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:44.965 [72/384] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:44.965 [73/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 
00:01:44.965 [74/384] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:44.965 [75/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:44.965 [76/384] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:44.965 [77/384] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:44.965 [78/384] Linking static target lib/librte_pci.a 00:01:45.242 [79/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:45.242 [80/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:45.242 [81/384] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:45.242 [82/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:45.242 [83/384] Linking static target lib/librte_ring.a 00:01:45.242 [84/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:45.242 [85/384] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:01:45.242 [86/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:45.242 [87/384] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:45.242 [88/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:45.242 [89/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:45.242 [90/384] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:45.242 [91/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:45.242 [92/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:45.242 [93/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:45.242 [94/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:45.242 [95/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:45.242 [96/384] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 
00:01:45.242 [97/384] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:45.242 [98/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:45.242 [99/384] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:45.242 [100/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:45.242 [101/384] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:45.242 [102/384] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:45.242 [103/384] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:45.242 [104/384] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:45.242 [105/384] Linking static target lib/librte_mempool.a 00:01:45.242 [106/384] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:45.242 [107/384] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:45.242 [108/384] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:45.242 [109/384] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:45.242 [110/384] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:45.242 [111/384] Linking static target lib/librte_rcu.a 00:01:45.242 [112/384] Linking static target lib/librte_eal.a 00:01:45.242 [113/384] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:45.242 [114/384] Linking static target lib/librte_net.a 00:01:45.242 [115/384] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:01:45.242 [116/384] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.242 [117/384] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:45.242 [118/384] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:45.506 [119/384] Linking static target lib/librte_meter.a 00:01:45.506 [120/384] Linking target lib/librte_log.so.24.1 00:01:45.506 [121/384] Generating 
lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.506 [122/384] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:45.506 [123/384] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:45.506 [124/384] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:45.506 [125/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:45.506 [126/384] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:01:45.506 [127/384] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:45.506 [128/384] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.507 [129/384] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:01:45.507 [130/384] Linking static target lib/librte_mbuf.a 00:01:45.507 [131/384] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:45.507 [132/384] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:45.771 [133/384] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:45.771 [134/384] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:45.771 [135/384] Linking target lib/librte_kvargs.so.24.1 00:01:45.771 [136/384] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:45.771 [137/384] Linking static target lib/librte_cmdline.a 00:01:45.771 [138/384] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:45.771 [139/384] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:45.771 [140/384] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:45.771 [141/384] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:45.771 [142/384] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:45.771 [143/384] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:45.771 [144/384] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:01:45.771 [145/384] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:45.771 [146/384] Linking static target lib/librte_timer.a 00:01:45.771 [147/384] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:45.771 [148/384] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:45.771 [149/384] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:45.771 [150/384] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:45.771 [151/384] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:45.771 [152/384] Linking static target lib/librte_stack.a 00:01:45.771 [153/384] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:45.771 [154/384] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:45.771 [155/384] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:45.771 [156/384] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.771 [157/384] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:45.771 [158/384] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.771 [159/384] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:45.771 [160/384] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:45.771 [161/384] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.771 [162/384] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:45.771 [163/384] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.771 [164/384] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:45.771 [165/384] 
Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:01:45.771 [166/384] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:01:45.771 [167/384] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:01:45.771 [168/384] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:45.771 [169/384] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:01:45.771 [170/384] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:45.771 [171/384] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:45.771 [172/384] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:45.771 [173/384] Linking target lib/librte_telemetry.so.24.1 00:01:46.036 [174/384] Linking static target lib/librte_dmadev.a 00:01:46.036 [175/384] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:46.036 [176/384] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:46.036 [177/384] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:46.036 [178/384] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:01:46.036 [179/384] Linking static target lib/librte_compressdev.a 00:01:46.036 [180/384] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:46.036 [181/384] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:46.036 [182/384] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:46.036 [183/384] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:46.036 [184/384] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:46.036 [185/384] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:46.036 [186/384] Linking static target lib/librte_reorder.a 00:01:46.036 [187/384] Linking static target lib/librte_power.a 
00:01:46.036 [188/384] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:46.036 [189/384] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:46.036 [190/384] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:46.036 [191/384] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:46.036 [192/384] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:46.036 [193/384] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:46.036 [194/384] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:46.036 [195/384] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:46.036 [196/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:01:46.036 [197/384] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:46.037 [198/384] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:46.037 [199/384] Linking static target lib/librte_security.a 00:01:46.037 [200/384] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:01:46.037 [201/384] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.295 [202/384] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:01:46.295 [203/384] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:46.295 [204/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:01:46.295 [205/384] Linking static target drivers/librte_bus_auxiliary.a 00:01:46.295 [206/384] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:46.295 [207/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:01:46.295 [208/384] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:46.295 [209/384] 
Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:46.295 [210/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:01:46.295 [211/384] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:46.295 [212/384] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.295 [213/384] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:46.295 [214/384] Linking static target lib/librte_hash.a 00:01:46.295 [215/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:01:46.295 [216/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:01:46.554 [217/384] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.554 [218/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:01:46.554 [219/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:01:46.554 [220/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:01:46.554 [221/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:01:46.554 [222/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:01:46.554 [223/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:01:46.554 [224/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:01:46.554 [225/384] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:46.554 [226/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:01:46.554 [227/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:01:46.554 [228/384] Compiling C object 
drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:46.554 [229/384] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:46.554 [230/384] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:46.554 [231/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:01:46.554 [232/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:01:46.554 [233/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:01:46.554 [234/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:01:46.554 [235/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:01:46.554 [236/384] Linking static target drivers/librte_bus_vdev.a 00:01:46.554 [237/384] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:46.554 [238/384] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.554 [239/384] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.554 [240/384] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:46.554 [241/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:01:46.554 [242/384] Linking static target drivers/librte_bus_pci.a 00:01:46.554 [243/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:01:46.554 [244/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:01:46.554 [245/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:01:46.554 [246/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:01:46.554 [247/384] 
Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:01:46.554 [248/384] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.554 [249/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:01:46.554 [250/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:01:46.554 [251/384] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:46.554 [252/384] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:46.554 [253/384] Linking static target lib/librte_cryptodev.a 00:01:46.554 [254/384] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.811 [255/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:01:46.811 [256/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:01:46.811 [257/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:01:46.811 [258/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:01:46.811 [259/384] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.811 [260/384] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.811 [261/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:01:46.811 [262/384] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:01:46.811 [263/384] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:01:46.811 [264/384] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:46.811 [265/384] Linking static target drivers/libtmp_rte_mempool_ring.a 
00:01:46.811 [266/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:01:46.811 [267/384] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:01:46.811 [268/384] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.811 [269/384] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.069 [270/384] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:01:47.069 [271/384] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.069 [272/384] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:01:47.069 [273/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:01:47.069 [274/384] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:01:47.069 [275/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:01:47.069 [276/384] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:47.069 [277/384] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:01:47.069 [278/384] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:01:47.069 [279/384] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:01:47.069 [280/384] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:01:47.069 [281/384] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:01:47.069 [282/384] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:01:47.069 [283/384] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:47.069 [284/384] 
Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:47.069 [285/384] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:01:47.069 [286/384] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:01:47.069 [287/384] Linking static target drivers/librte_mempool_ring.a 00:01:47.069 [288/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:01:47.069 [289/384] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:01:47.069 [290/384] Linking static target drivers/libtmp_rte_common_mlx5.a 00:01:47.069 [291/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:01:47.069 [292/384] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:01:47.069 [293/384] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:01:47.069 [294/384] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:47.069 [295/384] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:01:47.069 [296/384] Linking static target drivers/libtmp_rte_compress_isal.a 00:01:47.069 [297/384] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:47.328 [298/384] Linking static target lib/librte_ethdev.a 00:01:47.328 [299/384] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:01:47.328 [300/384] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:01:47.328 [301/384] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:01:47.328 [302/384] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.328 [303/384] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:47.328 [304/384] Generating 
lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.328 [305/384] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:47.328 [306/384] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:01:47.328 [307/384] Linking static target drivers/librte_compress_mlx5.a 00:01:47.328 [308/384] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:01:47.328 [309/384] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:47.328 [310/384] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:47.328 [311/384] Linking static target drivers/librte_crypto_mlx5.a 00:01:47.328 [312/384] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:01:47.328 [313/384] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:47.328 [314/384] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:47.328 [315/384] Linking static target drivers/librte_common_mlx5.a 00:01:47.328 [316/384] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:47.328 [317/384] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:47.586 [318/384] Linking static target drivers/librte_compress_isal.a 00:01:47.586 [319/384] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:01:47.586 [320/384] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:47.586 [321/384] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:47.586 [322/384] Linking static target drivers/librte_crypto_ipsec_mb.a 00:01:47.845 [323/384] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:01:48.104 [324/384] Linking static target drivers/libtmp_rte_common_qat.a 00:01:48.364 [325/384] Generating drivers/rte_common_qat.pmd.c with a custom command 00:01:48.364 [326/384] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:48.364 [327/384] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:48.364 [328/384] Linking static target drivers/librte_common_qat.a 00:01:48.623 [329/384] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:48.623 [330/384] Linking static target lib/librte_vhost.a 00:01:48.882 [331/384] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.421 [332/384] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.957 [333/384] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.248 [334/384] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.626 [335/384] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.884 [336/384] Linking target lib/librte_eal.so.24.1 00:01:58.884 [337/384] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:01:58.884 [338/384] Linking target lib/librte_pci.so.24.1 00:01:58.884 [339/384] Linking target lib/librte_timer.so.24.1 00:01:58.884 [340/384] Linking target lib/librte_ring.so.24.1 00:01:58.884 [341/384] Linking target lib/librte_meter.so.24.1 00:01:58.884 [342/384] Linking target drivers/librte_bus_vdev.so.24.1 00:01:58.884 [343/384] Linking target drivers/librte_bus_auxiliary.so.24.1 00:01:58.884 [344/384] Linking target lib/librte_stack.so.24.1 00:01:58.884 [345/384] Linking target lib/librte_dmadev.so.24.1 00:01:59.142 [346/384] Generating symbol file 
lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:01:59.142 [347/384] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:01:59.142 [348/384] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:01:59.143 [349/384] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:01:59.143 [350/384] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:01:59.143 [351/384] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:01:59.143 [352/384] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:01:59.143 [353/384] Linking target drivers/librte_bus_pci.so.24.1 00:01:59.143 [354/384] Linking target lib/librte_rcu.so.24.1 00:01:59.143 [355/384] Linking target lib/librte_mempool.so.24.1 00:01:59.401 [356/384] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:01:59.401 [357/384] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:01:59.401 [358/384] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:01:59.401 [359/384] Linking target drivers/librte_mempool_ring.so.24.1 00:01:59.401 [360/384] Linking target lib/librte_mbuf.so.24.1 00:01:59.659 [361/384] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:01:59.659 [362/384] Linking target lib/librte_reorder.so.24.1 00:01:59.659 [363/384] Linking target lib/librte_compressdev.so.24.1 00:01:59.660 [364/384] Linking target lib/librte_net.so.24.1 00:01:59.660 [365/384] Linking target lib/librte_cryptodev.so.24.1 00:01:59.918 [366/384] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:01:59.918 [367/384] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:01:59.918 [368/384] Generating symbol file 
lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:01:59.918 [369/384] Linking target lib/librte_hash.so.24.1 00:01:59.918 [370/384] Linking target lib/librte_security.so.24.1 00:01:59.918 [371/384] Linking target lib/librte_cmdline.so.24.1 00:01:59.918 [372/384] Linking target drivers/librte_compress_isal.so.24.1 00:01:59.918 [373/384] Linking target lib/librte_ethdev.so.24.1 00:02:00.176 [374/384] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:00.176 [375/384] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:00.176 [376/384] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:00.176 [377/384] Linking target drivers/librte_common_mlx5.so.24.1 00:02:00.176 [378/384] Linking target lib/librte_power.so.24.1 00:02:00.176 [379/384] Linking target lib/librte_vhost.so.24.1 00:02:00.176 [380/384] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:00.435 [381/384] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:00.435 [382/384] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:00.435 [383/384] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:00.435 [384/384] Linking target drivers/librte_common_qat.so.24.1 00:02:00.435 INFO: autodetecting backend as ninja 00:02:00.435 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:01.812 CC lib/log/log.o 00:02:01.812 CC lib/log/log_flags.o 00:02:01.812 CC lib/log/log_deprecated.o 00:02:01.812 CC lib/ut/ut.o 00:02:01.812 CC lib/ut_mock/mock.o 00:02:02.071 LIB libspdk_ut_mock.a 00:02:02.071 LIB libspdk_log.a 00:02:02.071 LIB libspdk_ut.a 00:02:02.071 SO libspdk_ut_mock.so.6.0 00:02:02.071 SO libspdk_log.so.7.0 00:02:02.071 SO libspdk_ut.so.2.0 00:02:02.071 SYMLINK libspdk_ut_mock.so 00:02:02.071 SYMLINK libspdk_ut.so 00:02:02.071 SYMLINK libspdk_log.so 
00:02:02.336 CC lib/util/base64.o 00:02:02.336 CC lib/util/bit_array.o 00:02:02.668 CC lib/util/cpuset.o 00:02:02.668 CC lib/util/crc16.o 00:02:02.668 CC lib/util/crc32.o 00:02:02.668 CC lib/util/crc32c.o 00:02:02.668 CC lib/dma/dma.o 00:02:02.668 CC lib/ioat/ioat.o 00:02:02.668 CC lib/util/crc32_ieee.o 00:02:02.668 CXX lib/trace_parser/trace.o 00:02:02.668 CC lib/util/crc64.o 00:02:02.668 CC lib/util/dif.o 00:02:02.668 CC lib/util/fd.o 00:02:02.668 CC lib/util/file.o 00:02:02.668 CC lib/util/hexlify.o 00:02:02.668 CC lib/util/iov.o 00:02:02.668 CC lib/util/math.o 00:02:02.668 CC lib/util/pipe.o 00:02:02.668 CC lib/util/strerror_tls.o 00:02:02.668 CC lib/util/string.o 00:02:02.668 CC lib/util/uuid.o 00:02:02.668 CC lib/util/fd_group.o 00:02:02.668 CC lib/util/xor.o 00:02:02.668 CC lib/util/zipf.o 00:02:02.668 CC lib/vfio_user/host/vfio_user_pci.o 00:02:02.668 CC lib/vfio_user/host/vfio_user.o 00:02:02.668 LIB libspdk_dma.a 00:02:02.931 SO libspdk_dma.so.4.0 00:02:02.931 LIB libspdk_ioat.a 00:02:02.931 SYMLINK libspdk_dma.so 00:02:02.931 SO libspdk_ioat.so.7.0 00:02:02.931 LIB libspdk_util.a 00:02:02.931 SYMLINK libspdk_ioat.so 00:02:02.931 LIB libspdk_vfio_user.a 00:02:02.931 SO libspdk_vfio_user.so.5.0 00:02:02.931 SO libspdk_util.so.9.0 00:02:02.931 SYMLINK libspdk_vfio_user.so 00:02:03.189 SYMLINK libspdk_util.so 00:02:03.447 LIB libspdk_trace_parser.a 00:02:03.447 SO libspdk_trace_parser.so.5.0 00:02:03.447 CC lib/conf/conf.o 00:02:03.447 CC lib/rdma/common.o 00:02:03.447 CC lib/rdma/rdma_verbs.o 00:02:03.447 CC lib/vmd/vmd.o 00:02:03.447 CC lib/vmd/led.o 00:02:03.447 CC lib/json/json_parse.o 00:02:03.447 CC lib/reduce/reduce.o 00:02:03.447 CC lib/json/json_util.o 00:02:03.447 CC lib/idxd/idxd_user.o 00:02:03.447 CC lib/env_dpdk/env.o 00:02:03.447 CC lib/idxd/idxd.o 00:02:03.447 CC lib/json/json_write.o 00:02:03.447 CC lib/env_dpdk/memory.o 00:02:03.447 CC lib/env_dpdk/pci.o 00:02:03.447 CC lib/env_dpdk/init.o 00:02:03.447 CC lib/env_dpdk/threads.o 00:02:03.447 
CC lib/env_dpdk/pci_virtio.o 00:02:03.447 CC lib/env_dpdk/pci_ioat.o 00:02:03.447 CC lib/env_dpdk/pci_vmd.o 00:02:03.447 CC lib/env_dpdk/pci_idxd.o 00:02:03.447 CC lib/env_dpdk/pci_event.o 00:02:03.447 CC lib/env_dpdk/sigbus_handler.o 00:02:03.447 CC lib/env_dpdk/pci_dpdk.o 00:02:03.447 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:03.447 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:03.707 SYMLINK libspdk_trace_parser.so 00:02:03.707 LIB libspdk_conf.a 00:02:03.966 SO libspdk_conf.so.6.0 00:02:03.966 LIB libspdk_rdma.a 00:02:03.966 LIB libspdk_json.a 00:02:03.966 SYMLINK libspdk_conf.so 00:02:03.966 SO libspdk_rdma.so.6.0 00:02:03.966 SO libspdk_json.so.6.0 00:02:03.966 SYMLINK libspdk_rdma.so 00:02:03.966 SYMLINK libspdk_json.so 00:02:04.226 LIB libspdk_idxd.a 00:02:04.226 SO libspdk_idxd.so.12.0 00:02:04.226 LIB libspdk_vmd.a 00:02:04.226 LIB libspdk_reduce.a 00:02:04.226 SO libspdk_vmd.so.6.0 00:02:04.226 SYMLINK libspdk_idxd.so 00:02:04.226 SO libspdk_reduce.so.6.0 00:02:04.226 SYMLINK libspdk_vmd.so 00:02:04.226 SYMLINK libspdk_reduce.so 00:02:04.485 CC lib/jsonrpc/jsonrpc_server.o 00:02:04.485 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:04.485 CC lib/jsonrpc/jsonrpc_client.o 00:02:04.485 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:04.744 LIB libspdk_jsonrpc.a 00:02:04.744 SO libspdk_jsonrpc.so.6.0 00:02:04.744 SYMLINK libspdk_jsonrpc.so 00:02:05.003 LIB libspdk_env_dpdk.a 00:02:05.003 SO libspdk_env_dpdk.so.14.0 00:02:05.261 CC lib/rpc/rpc.o 00:02:05.261 SYMLINK libspdk_env_dpdk.so 00:02:05.261 LIB libspdk_rpc.a 00:02:05.519 SO libspdk_rpc.so.6.0 00:02:05.519 SYMLINK libspdk_rpc.so 00:02:05.778 CC lib/notify/notify.o 00:02:05.778 CC lib/notify/notify_rpc.o 00:02:05.778 CC lib/trace/trace.o 00:02:05.778 CC lib/keyring/keyring.o 00:02:05.778 CC lib/trace/trace_flags.o 00:02:05.778 CC lib/keyring/keyring_rpc.o 00:02:05.778 CC lib/trace/trace_rpc.o 00:02:06.037 LIB libspdk_notify.a 00:02:06.037 SO libspdk_notify.so.6.0 00:02:06.037 LIB libspdk_keyring.a 00:02:06.295 LIB 
libspdk_trace.a 00:02:06.295 SYMLINK libspdk_notify.so 00:02:06.295 SO libspdk_keyring.so.1.0 00:02:06.295 SO libspdk_trace.so.10.0 00:02:06.295 SYMLINK libspdk_keyring.so 00:02:06.295 SYMLINK libspdk_trace.so 00:02:06.554 CC lib/thread/thread.o 00:02:06.554 CC lib/sock/sock.o 00:02:06.554 CC lib/thread/iobuf.o 00:02:06.554 CC lib/sock/sock_rpc.o 00:02:07.121 LIB libspdk_sock.a 00:02:07.121 SO libspdk_sock.so.9.0 00:02:07.121 SYMLINK libspdk_sock.so 00:02:07.687 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:07.687 CC lib/nvme/nvme_ctrlr.o 00:02:07.687 CC lib/nvme/nvme_ns_cmd.o 00:02:07.687 CC lib/nvme/nvme_fabric.o 00:02:07.687 CC lib/nvme/nvme_ns.o 00:02:07.687 CC lib/nvme/nvme_pcie_common.o 00:02:07.688 CC lib/nvme/nvme_pcie.o 00:02:07.688 CC lib/nvme/nvme_qpair.o 00:02:07.688 CC lib/nvme/nvme.o 00:02:07.688 CC lib/nvme/nvme_quirks.o 00:02:07.688 CC lib/nvme/nvme_transport.o 00:02:07.688 CC lib/nvme/nvme_discovery.o 00:02:07.688 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:07.688 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:07.688 CC lib/nvme/nvme_tcp.o 00:02:07.688 CC lib/nvme/nvme_opal.o 00:02:07.688 CC lib/nvme/nvme_io_msg.o 00:02:07.688 CC lib/nvme/nvme_poll_group.o 00:02:07.688 CC lib/nvme/nvme_zns.o 00:02:07.688 CC lib/nvme/nvme_stubs.o 00:02:07.688 CC lib/nvme/nvme_auth.o 00:02:07.688 CC lib/nvme/nvme_cuse.o 00:02:07.688 CC lib/nvme/nvme_rdma.o 00:02:08.253 LIB libspdk_thread.a 00:02:08.253 SO libspdk_thread.so.10.0 00:02:08.253 SYMLINK libspdk_thread.so 00:02:08.820 CC lib/virtio/virtio.o 00:02:08.820 CC lib/virtio/virtio_vhost_user.o 00:02:08.820 CC lib/virtio/virtio_vfio_user.o 00:02:08.820 CC lib/virtio/virtio_pci.o 00:02:08.820 CC lib/init/json_config.o 00:02:08.820 CC lib/init/subsystem.o 00:02:08.820 CC lib/blob/blobstore.o 00:02:08.820 CC lib/init/rpc.o 00:02:08.820 CC lib/init/subsystem_rpc.o 00:02:08.820 CC lib/blob/request.o 00:02:08.820 CC lib/blob/zeroes.o 00:02:08.820 CC lib/blob/blob_bs_dev.o 00:02:08.820 CC lib/accel/accel.o 00:02:08.820 CC lib/accel/accel_rpc.o 
00:02:08.820 CC lib/accel/accel_sw.o 00:02:09.078 LIB libspdk_init.a 00:02:09.078 LIB libspdk_virtio.a 00:02:09.078 SO libspdk_init.so.5.0 00:02:09.078 SO libspdk_virtio.so.7.0 00:02:09.078 SYMLINK libspdk_init.so 00:02:09.078 SYMLINK libspdk_virtio.so 00:02:09.337 CC lib/event/app.o 00:02:09.337 CC lib/event/reactor.o 00:02:09.337 CC lib/event/log_rpc.o 00:02:09.337 CC lib/event/scheduler_static.o 00:02:09.337 CC lib/event/app_rpc.o 00:02:09.596 LIB libspdk_accel.a 00:02:09.596 SO libspdk_accel.so.15.0 00:02:09.855 LIB libspdk_nvme.a 00:02:09.855 SYMLINK libspdk_accel.so 00:02:09.855 LIB libspdk_event.a 00:02:09.855 SO libspdk_nvme.so.13.0 00:02:09.855 SO libspdk_event.so.13.0 00:02:10.113 SYMLINK libspdk_event.so 00:02:10.113 CC lib/bdev/bdev.o 00:02:10.113 CC lib/bdev/bdev_rpc.o 00:02:10.113 CC lib/bdev/part.o 00:02:10.113 CC lib/bdev/bdev_zone.o 00:02:10.113 CC lib/bdev/scsi_nvme.o 00:02:10.372 SYMLINK libspdk_nvme.so 00:02:11.748 LIB libspdk_blob.a 00:02:11.748 SO libspdk_blob.so.11.0 00:02:11.748 SYMLINK libspdk_blob.so 00:02:12.314 CC lib/lvol/lvol.o 00:02:12.314 CC lib/blobfs/tree.o 00:02:12.314 CC lib/blobfs/blobfs.o 00:02:12.883 LIB libspdk_bdev.a 00:02:12.883 SO libspdk_bdev.so.15.0 00:02:12.883 SYMLINK libspdk_bdev.so 00:02:13.142 LIB libspdk_blobfs.a 00:02:13.142 SO libspdk_blobfs.so.10.0 00:02:13.142 LIB libspdk_lvol.a 00:02:13.142 SO libspdk_lvol.so.10.0 00:02:13.142 SYMLINK libspdk_blobfs.so 00:02:13.142 SYMLINK libspdk_lvol.so 00:02:13.411 CC lib/nbd/nbd_rpc.o 00:02:13.411 CC lib/nbd/nbd.o 00:02:13.411 CC lib/nvmf/ctrlr.o 00:02:13.411 CC lib/nvmf/ctrlr_discovery.o 00:02:13.411 CC lib/nvmf/ctrlr_bdev.o 00:02:13.411 CC lib/nvmf/nvmf.o 00:02:13.411 CC lib/nvmf/subsystem.o 00:02:13.411 CC lib/nvmf/transport.o 00:02:13.411 CC lib/nvmf/nvmf_rpc.o 00:02:13.411 CC lib/nvmf/rdma.o 00:02:13.411 CC lib/nvmf/tcp.o 00:02:13.411 CC lib/scsi/dev.o 00:02:13.411 CC lib/nvmf/stubs.o 00:02:13.411 CC lib/scsi/lun.o 00:02:13.411 CC lib/nvmf/auth.o 00:02:13.411 CC 
lib/scsi/port.o 00:02:13.411 CC lib/scsi/scsi.o 00:02:13.411 CC lib/scsi/scsi_bdev.o 00:02:13.411 CC lib/ftl/ftl_core.o 00:02:13.411 CC lib/ublk/ublk.o 00:02:13.411 CC lib/ftl/ftl_init.o 00:02:13.411 CC lib/scsi/scsi_pr.o 00:02:13.411 CC lib/ublk/ublk_rpc.o 00:02:13.411 CC lib/scsi/scsi_rpc.o 00:02:13.411 CC lib/ftl/ftl_layout.o 00:02:13.411 CC lib/ftl/ftl_io.o 00:02:13.411 CC lib/scsi/task.o 00:02:13.411 CC lib/ftl/ftl_debug.o 00:02:13.411 CC lib/ftl/ftl_sb.o 00:02:13.411 CC lib/ftl/ftl_l2p.o 00:02:13.411 CC lib/ftl/ftl_nv_cache.o 00:02:13.411 CC lib/ftl/ftl_l2p_flat.o 00:02:13.411 CC lib/ftl/ftl_band_ops.o 00:02:13.411 CC lib/ftl/ftl_band.o 00:02:13.411 CC lib/ftl/ftl_rq.o 00:02:13.411 CC lib/ftl/ftl_writer.o 00:02:13.411 CC lib/ftl/ftl_reloc.o 00:02:13.411 CC lib/ftl/ftl_l2p_cache.o 00:02:13.411 CC lib/ftl/ftl_p2l.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:13.411 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:13.411 CC lib/ftl/utils/ftl_conf.o 00:02:13.411 CC lib/ftl/utils/ftl_md.o 00:02:13.411 CC lib/ftl/utils/ftl_bitmap.o 00:02:13.411 CC lib/ftl/utils/ftl_property.o 00:02:13.411 CC lib/ftl/utils/ftl_mempool.o 00:02:13.411 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:13.411 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:13.411 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:13.411 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:13.411 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:13.411 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:13.411 CC lib/ftl/upgrade/ftl_sb_v3.o 
00:02:13.411 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:13.411 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:13.411 CC lib/ftl/base/ftl_base_dev.o 00:02:13.411 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:13.411 CC lib/ftl/base/ftl_base_bdev.o 00:02:13.411 CC lib/ftl/ftl_trace.o 00:02:13.978 LIB libspdk_nbd.a 00:02:13.978 SO libspdk_nbd.so.7.0 00:02:13.978 LIB libspdk_scsi.a 00:02:13.978 SYMLINK libspdk_nbd.so 00:02:14.236 SO libspdk_scsi.so.9.0 00:02:14.236 SYMLINK libspdk_scsi.so 00:02:14.236 LIB libspdk_ublk.a 00:02:14.236 SO libspdk_ublk.so.3.0 00:02:14.494 SYMLINK libspdk_ublk.so 00:02:14.494 LIB libspdk_ftl.a 00:02:14.494 CC lib/iscsi/conn.o 00:02:14.494 CC lib/iscsi/init_grp.o 00:02:14.494 CC lib/iscsi/md5.o 00:02:14.494 CC lib/iscsi/iscsi.o 00:02:14.494 CC lib/vhost/vhost.o 00:02:14.494 CC lib/iscsi/param.o 00:02:14.494 CC lib/vhost/vhost_rpc.o 00:02:14.494 CC lib/iscsi/portal_grp.o 00:02:14.494 CC lib/vhost/vhost_scsi.o 00:02:14.494 CC lib/iscsi/tgt_node.o 00:02:14.494 CC lib/vhost/vhost_blk.o 00:02:14.494 CC lib/iscsi/iscsi_subsystem.o 00:02:14.494 CC lib/vhost/rte_vhost_user.o 00:02:14.494 CC lib/iscsi/iscsi_rpc.o 00:02:14.494 CC lib/iscsi/task.o 00:02:14.752 SO libspdk_ftl.so.9.0 00:02:15.318 SYMLINK libspdk_ftl.so 00:02:15.318 LIB libspdk_nvmf.a 00:02:15.318 SO libspdk_nvmf.so.18.0 00:02:15.577 SYMLINK libspdk_nvmf.so 00:02:15.577 LIB libspdk_iscsi.a 00:02:15.577 LIB libspdk_vhost.a 00:02:15.836 SO libspdk_iscsi.so.8.0 00:02:15.836 SO libspdk_vhost.so.8.0 00:02:15.836 SYMLINK libspdk_vhost.so 00:02:15.836 SYMLINK libspdk_iscsi.so 00:02:16.403 CC module/env_dpdk/env_dpdk_rpc.o 00:02:16.662 CC module/blob/bdev/blob_bdev.o 00:02:16.662 LIB libspdk_env_dpdk_rpc.a 00:02:16.662 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:16.662 CC module/accel/ioat/accel_ioat.o 00:02:16.662 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:16.662 CC module/accel/ioat/accel_ioat_rpc.o 00:02:16.662 CC module/keyring/file/keyring.o 00:02:16.662 CC 
module/keyring/file/keyring_rpc.o 00:02:16.662 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:16.662 CC module/accel/error/accel_error.o 00:02:16.662 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:16.662 CC module/accel/error/accel_error_rpc.o 00:02:16.662 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:16.662 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:16.662 CC module/scheduler/gscheduler/gscheduler.o 00:02:16.662 CC module/sock/posix/posix.o 00:02:16.662 CC module/accel/dsa/accel_dsa.o 00:02:16.662 CC module/accel/iaa/accel_iaa_rpc.o 00:02:16.662 CC module/accel/dsa/accel_dsa_rpc.o 00:02:16.662 CC module/accel/iaa/accel_iaa.o 00:02:16.662 SO libspdk_env_dpdk_rpc.so.6.0 00:02:16.662 SYMLINK libspdk_env_dpdk_rpc.so 00:02:16.922 LIB libspdk_keyring_file.a 00:02:16.922 LIB libspdk_scheduler_dpdk_governor.a 00:02:16.922 LIB libspdk_scheduler_gscheduler.a 00:02:16.922 SO libspdk_keyring_file.so.1.0 00:02:16.922 LIB libspdk_accel_error.a 00:02:16.922 LIB libspdk_scheduler_dynamic.a 00:02:16.922 LIB libspdk_accel_ioat.a 00:02:16.922 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:16.922 SO libspdk_scheduler_dynamic.so.4.0 00:02:16.922 SO libspdk_scheduler_gscheduler.so.4.0 00:02:16.922 LIB libspdk_accel_iaa.a 00:02:16.922 SO libspdk_accel_error.so.2.0 00:02:16.922 SO libspdk_accel_ioat.so.6.0 00:02:16.922 SYMLINK libspdk_keyring_file.so 00:02:16.922 LIB libspdk_accel_dsa.a 00:02:16.922 LIB libspdk_blob_bdev.a 00:02:16.922 SO libspdk_accel_iaa.so.3.0 00:02:16.922 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:16.922 SYMLINK libspdk_scheduler_dynamic.so 00:02:16.922 SO libspdk_blob_bdev.so.11.0 00:02:16.922 SYMLINK libspdk_scheduler_gscheduler.so 00:02:16.922 SO libspdk_accel_dsa.so.5.0 00:02:16.922 SYMLINK libspdk_accel_error.so 00:02:16.922 SYMLINK libspdk_accel_ioat.so 00:02:17.221 SYMLINK libspdk_accel_iaa.so 00:02:17.221 SYMLINK libspdk_blob_bdev.so 00:02:17.221 SYMLINK libspdk_accel_dsa.so 00:02:17.505 LIB 
libspdk_sock_posix.a 00:02:17.505 SO libspdk_sock_posix.so.6.0 00:02:17.505 CC module/bdev/gpt/gpt.o 00:02:17.505 CC module/bdev/gpt/vbdev_gpt.o 00:02:17.505 CC module/bdev/nvme/bdev_nvme.o 00:02:17.505 CC module/bdev/error/vbdev_error.o 00:02:17.505 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:17.505 CC module/bdev/error/vbdev_error_rpc.o 00:02:17.505 CC module/bdev/nvme/bdev_mdns_client.o 00:02:17.505 CC module/bdev/nvme/nvme_rpc.o 00:02:17.505 CC module/bdev/lvol/vbdev_lvol.o 00:02:17.505 CC module/bdev/nvme/vbdev_opal.o 00:02:17.505 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:17.505 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:17.505 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:17.505 CC module/blobfs/bdev/blobfs_bdev.o 00:02:17.505 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:17.505 CC module/bdev/split/vbdev_split.o 00:02:17.505 CC module/bdev/iscsi/bdev_iscsi.o 00:02:17.505 CC module/bdev/split/vbdev_split_rpc.o 00:02:17.505 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:17.505 CC module/bdev/delay/vbdev_delay.o 00:02:17.505 CC module/bdev/malloc/bdev_malloc.o 00:02:17.505 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:17.505 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:17.505 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:17.505 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:17.505 CC module/bdev/crypto/vbdev_crypto.o 00:02:17.505 CC module/bdev/null/bdev_null_rpc.o 00:02:17.505 CC module/bdev/null/bdev_null.o 00:02:17.505 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:17.505 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:17.505 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:17.505 CC module/bdev/ftl/bdev_ftl.o 00:02:17.505 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:17.506 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:17.506 CC module/bdev/aio/bdev_aio.o 00:02:17.506 CC module/bdev/passthru/vbdev_passthru.o 00:02:17.506 CC module/bdev/raid/bdev_raid.o 00:02:17.506 CC module/bdev/aio/bdev_aio_rpc.o 00:02:17.506 CC module/bdev/passthru/vbdev_passthru_rpc.o 
00:02:17.506 CC module/bdev/raid/bdev_raid_rpc.o 00:02:17.506 CC module/bdev/raid/bdev_raid_sb.o 00:02:17.506 CC module/bdev/raid/raid0.o 00:02:17.506 CC module/bdev/raid/raid1.o 00:02:17.506 CC module/bdev/raid/concat.o 00:02:17.506 CC module/bdev/compress/vbdev_compress.o 00:02:17.506 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:17.506 SYMLINK libspdk_sock_posix.so 00:02:17.763 LIB libspdk_bdev_split.a 00:02:17.763 LIB libspdk_bdev_error.a 00:02:17.763 LIB libspdk_blobfs_bdev.a 00:02:18.021 SO libspdk_bdev_split.so.6.0 00:02:18.021 LIB libspdk_accel_dpdk_compressdev.a 00:02:18.021 SO libspdk_bdev_error.so.6.0 00:02:18.021 LIB libspdk_bdev_gpt.a 00:02:18.021 SO libspdk_blobfs_bdev.so.6.0 00:02:18.021 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:18.021 SO libspdk_bdev_gpt.so.6.0 00:02:18.021 LIB libspdk_bdev_ftl.a 00:02:18.021 SYMLINK libspdk_bdev_split.so 00:02:18.021 LIB libspdk_bdev_iscsi.a 00:02:18.021 SYMLINK libspdk_bdev_error.so 00:02:18.021 LIB libspdk_bdev_zone_block.a 00:02:18.021 LIB libspdk_bdev_crypto.a 00:02:18.021 LIB libspdk_bdev_passthru.a 00:02:18.021 LIB libspdk_bdev_delay.a 00:02:18.021 SYMLINK libspdk_blobfs_bdev.so 00:02:18.021 SO libspdk_bdev_ftl.so.6.0 00:02:18.021 LIB libspdk_bdev_null.a 00:02:18.021 LIB libspdk_bdev_malloc.a 00:02:18.021 SO libspdk_bdev_iscsi.so.6.0 00:02:18.021 SO libspdk_bdev_crypto.so.6.0 00:02:18.021 SO libspdk_bdev_zone_block.so.6.0 00:02:18.021 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:18.021 SO libspdk_bdev_delay.so.6.0 00:02:18.021 SO libspdk_bdev_passthru.so.6.0 00:02:18.021 SYMLINK libspdk_bdev_gpt.so 00:02:18.021 LIB libspdk_bdev_aio.a 00:02:18.021 SO libspdk_bdev_null.so.6.0 00:02:18.021 SO libspdk_bdev_malloc.so.6.0 00:02:18.021 SYMLINK libspdk_bdev_ftl.so 00:02:18.021 LIB libspdk_accel_dpdk_cryptodev.a 00:02:18.021 SYMLINK libspdk_bdev_zone_block.so 00:02:18.021 SYMLINK libspdk_bdev_iscsi.so 00:02:18.021 SYMLINK libspdk_bdev_crypto.so 00:02:18.021 SO libspdk_bdev_aio.so.6.0 00:02:18.021 LIB 
libspdk_bdev_compress.a 00:02:18.021 SYMLINK libspdk_bdev_passthru.so 00:02:18.021 SYMLINK libspdk_bdev_delay.so 00:02:18.021 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:18.021 SYMLINK libspdk_bdev_malloc.so 00:02:18.280 SYMLINK libspdk_bdev_null.so 00:02:18.280 SO libspdk_bdev_compress.so.6.0 00:02:18.280 LIB libspdk_bdev_virtio.a 00:02:18.280 SYMLINK libspdk_bdev_aio.so 00:02:18.280 SO libspdk_bdev_virtio.so.6.0 00:02:18.280 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:18.280 SYMLINK libspdk_bdev_compress.so 00:02:18.280 LIB libspdk_bdev_lvol.a 00:02:18.280 SO libspdk_bdev_lvol.so.6.0 00:02:18.280 SYMLINK libspdk_bdev_virtio.so 00:02:18.280 SYMLINK libspdk_bdev_lvol.so 00:02:18.539 LIB libspdk_bdev_raid.a 00:02:18.539 SO libspdk_bdev_raid.so.6.0 00:02:18.807 SYMLINK libspdk_bdev_raid.so 00:02:20.183 LIB libspdk_bdev_nvme.a 00:02:20.183 SO libspdk_bdev_nvme.so.7.0 00:02:20.183 SYMLINK libspdk_bdev_nvme.so 00:02:21.120 CC module/event/subsystems/vmd/vmd.o 00:02:21.120 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:21.120 CC module/event/subsystems/scheduler/scheduler.o 00:02:21.120 CC module/event/subsystems/sock/sock.o 00:02:21.120 CC module/event/subsystems/iobuf/iobuf.o 00:02:21.120 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:21.120 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:21.120 CC module/event/subsystems/keyring/keyring.o 00:02:21.120 LIB libspdk_event_scheduler.a 00:02:21.120 LIB libspdk_event_vmd.a 00:02:21.120 LIB libspdk_event_keyring.a 00:02:21.120 LIB libspdk_event_sock.a 00:02:21.120 SO libspdk_event_scheduler.so.4.0 00:02:21.120 LIB libspdk_event_vhost_blk.a 00:02:21.120 LIB libspdk_event_iobuf.a 00:02:21.120 SO libspdk_event_keyring.so.1.0 00:02:21.120 SO libspdk_event_vmd.so.6.0 00:02:21.120 SO libspdk_event_sock.so.5.0 00:02:21.120 SO libspdk_event_vhost_blk.so.3.0 00:02:21.120 SO libspdk_event_iobuf.so.3.0 00:02:21.120 SYMLINK libspdk_event_scheduler.so 00:02:21.120 SYMLINK libspdk_event_keyring.so 00:02:21.120 SYMLINK 
libspdk_event_vmd.so 00:02:21.120 SYMLINK libspdk_event_sock.so 00:02:21.120 SYMLINK libspdk_event_vhost_blk.so 00:02:21.380 SYMLINK libspdk_event_iobuf.so 00:02:21.639 CC module/event/subsystems/accel/accel.o 00:02:21.639 LIB libspdk_event_accel.a 00:02:21.898 SO libspdk_event_accel.so.6.0 00:02:21.898 SYMLINK libspdk_event_accel.so 00:02:22.157 CC module/event/subsystems/bdev/bdev.o 00:02:22.416 LIB libspdk_event_bdev.a 00:02:22.416 SO libspdk_event_bdev.so.6.0 00:02:22.675 SYMLINK libspdk_event_bdev.so 00:02:22.933 CC module/event/subsystems/ublk/ublk.o 00:02:22.933 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:22.933 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:22.933 CC module/event/subsystems/nbd/nbd.o 00:02:22.933 CC module/event/subsystems/scsi/scsi.o 00:02:23.193 LIB libspdk_event_ublk.a 00:02:23.193 LIB libspdk_event_nbd.a 00:02:23.193 LIB libspdk_event_scsi.a 00:02:23.193 SO libspdk_event_nbd.so.6.0 00:02:23.193 SO libspdk_event_ublk.so.3.0 00:02:23.193 SO libspdk_event_scsi.so.6.0 00:02:23.193 LIB libspdk_event_nvmf.a 00:02:23.193 SYMLINK libspdk_event_nbd.so 00:02:23.193 SYMLINK libspdk_event_ublk.so 00:02:23.193 SO libspdk_event_nvmf.so.6.0 00:02:23.193 SYMLINK libspdk_event_scsi.so 00:02:23.193 SYMLINK libspdk_event_nvmf.so 00:02:23.761 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:23.761 CC module/event/subsystems/iscsi/iscsi.o 00:02:23.761 LIB libspdk_event_vhost_scsi.a 00:02:23.761 LIB libspdk_event_iscsi.a 00:02:23.761 SO libspdk_event_vhost_scsi.so.3.0 00:02:23.761 SO libspdk_event_iscsi.so.6.0 00:02:24.020 SYMLINK libspdk_event_vhost_scsi.so 00:02:24.020 SYMLINK libspdk_event_iscsi.so 00:02:24.020 SO libspdk.so.6.0 00:02:24.020 SYMLINK libspdk.so 00:02:24.589 CXX app/trace/trace.o 00:02:24.589 CC app/trace_record/trace_record.o 00:02:24.590 CC app/spdk_nvme_perf/perf.o 00:02:24.590 CC app/spdk_top/spdk_top.o 00:02:24.590 CC app/spdk_lspci/spdk_lspci.o 00:02:24.590 CC app/spdk_nvme_discover/discovery_aer.o 00:02:24.590 CC 
app/spdk_nvme_identify/identify.o 00:02:24.590 TEST_HEADER include/spdk/accel.h 00:02:24.590 TEST_HEADER include/spdk/accel_module.h 00:02:24.590 TEST_HEADER include/spdk/assert.h 00:02:24.590 TEST_HEADER include/spdk/barrier.h 00:02:24.590 TEST_HEADER include/spdk/bdev.h 00:02:24.590 TEST_HEADER include/spdk/base64.h 00:02:24.590 TEST_HEADER include/spdk/bdev_module.h 00:02:24.590 TEST_HEADER include/spdk/bdev_zone.h 00:02:24.590 TEST_HEADER include/spdk/bit_array.h 00:02:24.590 TEST_HEADER include/spdk/bit_pool.h 00:02:24.590 TEST_HEADER include/spdk/blob_bdev.h 00:02:24.590 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:24.590 CC test/rpc_client/rpc_client_test.o 00:02:24.590 TEST_HEADER include/spdk/blobfs.h 00:02:24.590 TEST_HEADER include/spdk/blob.h 00:02:24.590 TEST_HEADER include/spdk/conf.h 00:02:24.590 TEST_HEADER include/spdk/config.h 00:02:24.590 TEST_HEADER include/spdk/cpuset.h 00:02:24.590 TEST_HEADER include/spdk/crc16.h 00:02:24.590 TEST_HEADER include/spdk/crc32.h 00:02:24.590 TEST_HEADER include/spdk/crc64.h 00:02:24.590 TEST_HEADER include/spdk/dif.h 00:02:24.590 TEST_HEADER include/spdk/dma.h 00:02:24.590 TEST_HEADER include/spdk/endian.h 00:02:24.590 TEST_HEADER include/spdk/env_dpdk.h 00:02:24.590 TEST_HEADER include/spdk/env.h 00:02:24.590 TEST_HEADER include/spdk/event.h 00:02:24.590 CC app/nvmf_tgt/nvmf_main.o 00:02:24.590 TEST_HEADER include/spdk/fd_group.h 00:02:24.590 CC app/iscsi_tgt/iscsi_tgt.o 00:02:24.590 TEST_HEADER include/spdk/fd.h 00:02:24.590 TEST_HEADER include/spdk/file.h 00:02:24.590 CC app/spdk_dd/spdk_dd.o 00:02:24.590 TEST_HEADER include/spdk/ftl.h 00:02:24.590 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:24.590 TEST_HEADER include/spdk/gpt_spec.h 00:02:24.590 CC app/vhost/vhost.o 00:02:24.590 TEST_HEADER include/spdk/hexlify.h 00:02:24.590 TEST_HEADER include/spdk/histogram_data.h 00:02:24.590 TEST_HEADER include/spdk/idxd.h 00:02:24.590 TEST_HEADER include/spdk/idxd_spec.h 00:02:24.590 TEST_HEADER include/spdk/init.h 
00:02:24.590 TEST_HEADER include/spdk/ioat.h 00:02:24.590 CC app/spdk_tgt/spdk_tgt.o 00:02:24.590 TEST_HEADER include/spdk/ioat_spec.h 00:02:24.590 TEST_HEADER include/spdk/iscsi_spec.h 00:02:24.590 TEST_HEADER include/spdk/json.h 00:02:24.590 TEST_HEADER include/spdk/jsonrpc.h 00:02:24.590 TEST_HEADER include/spdk/keyring.h 00:02:24.590 TEST_HEADER include/spdk/keyring_module.h 00:02:24.590 TEST_HEADER include/spdk/likely.h 00:02:24.852 TEST_HEADER include/spdk/log.h 00:02:24.852 CC test/app/histogram_perf/histogram_perf.o 00:02:24.852 CC examples/nvme/reconnect/reconnect.o 00:02:24.852 CC examples/nvme/arbitration/arbitration.o 00:02:24.852 TEST_HEADER include/spdk/lvol.h 00:02:24.852 CC app/fio/nvme/fio_plugin.o 00:02:24.852 CC examples/accel/perf/accel_perf.o 00:02:24.852 CC test/thread/poller_perf/poller_perf.o 00:02:24.852 CC examples/sock/hello_world/hello_sock.o 00:02:24.852 TEST_HEADER include/spdk/memory.h 00:02:24.852 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:24.852 CC test/nvme/startup/startup.o 00:02:24.852 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:24.852 TEST_HEADER include/spdk/mmio.h 00:02:24.852 CC examples/nvme/abort/abort.o 00:02:24.852 CC examples/nvme/hotplug/hotplug.o 00:02:24.852 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:24.852 CC test/nvme/reserve/reserve.o 00:02:24.852 CC examples/util/zipf/zipf.o 00:02:24.852 CC examples/idxd/perf/perf.o 00:02:24.852 CC examples/ioat/perf/perf.o 00:02:24.852 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:24.852 CC examples/ioat/verify/verify.o 00:02:24.852 CC test/nvme/aer/aer.o 00:02:24.852 TEST_HEADER include/spdk/nbd.h 00:02:24.852 CC test/nvme/reset/reset.o 00:02:24.852 CC examples/vmd/lsvmd/lsvmd.o 00:02:24.852 CC test/env/vtophys/vtophys.o 00:02:24.852 CC examples/nvme/hello_world/hello_world.o 00:02:24.852 CC test/nvme/overhead/overhead.o 00:02:24.852 CC test/nvme/sgl/sgl.o 00:02:24.852 CC test/nvme/connect_stress/connect_stress.o 00:02:24.852 TEST_HEADER 
include/spdk/notify.h 00:02:24.852 TEST_HEADER include/spdk/nvme.h 00:02:24.852 CC test/nvme/compliance/nvme_compliance.o 00:02:24.852 CC test/nvme/simple_copy/simple_copy.o 00:02:24.852 CC examples/vmd/led/led.o 00:02:24.852 TEST_HEADER include/spdk/nvme_intel.h 00:02:24.852 CC test/event/reactor/reactor.o 00:02:24.852 CC test/env/pci/pci_ut.o 00:02:24.852 CC test/env/memory/memory_ut.o 00:02:24.852 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:24.852 CC test/event/event_perf/event_perf.o 00:02:24.852 CC test/event/reactor_perf/reactor_perf.o 00:02:24.852 CC test/app/jsoncat/jsoncat.o 00:02:24.852 CC test/app/stub/stub.o 00:02:24.852 CC test/nvme/boot_partition/boot_partition.o 00:02:24.852 CC test/nvme/err_injection/err_injection.o 00:02:24.852 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:24.852 CC test/nvme/e2edp/nvme_dp.o 00:02:24.852 CC test/app/bdev_svc/bdev_svc.o 00:02:24.852 TEST_HEADER include/spdk/nvme_spec.h 00:02:24.852 TEST_HEADER include/spdk/nvme_zns.h 00:02:24.852 CC examples/bdev/hello_world/hello_bdev.o 00:02:24.852 CC test/event/app_repeat/app_repeat.o 00:02:24.852 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:24.852 CC app/fio/bdev/fio_plugin.o 00:02:24.852 CC examples/nvmf/nvmf/nvmf.o 00:02:24.852 CC test/accel/dif/dif.o 00:02:24.852 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:24.852 CC test/bdev/bdevio/bdevio.o 00:02:24.852 CC examples/blob/cli/blobcli.o 00:02:24.852 CC test/blobfs/mkfs/mkfs.o 00:02:24.852 TEST_HEADER include/spdk/nvmf.h 00:02:24.852 TEST_HEADER include/spdk/nvmf_spec.h 00:02:24.852 CC examples/blob/hello_world/hello_blob.o 00:02:24.852 TEST_HEADER include/spdk/nvmf_transport.h 00:02:24.852 CC test/dma/test_dma/test_dma.o 00:02:24.852 TEST_HEADER include/spdk/opal.h 00:02:24.852 TEST_HEADER include/spdk/opal_spec.h 00:02:24.852 CC examples/bdev/bdevperf/bdevperf.o 00:02:24.852 TEST_HEADER include/spdk/pci_ids.h 00:02:24.852 CC examples/thread/thread/thread_ex.o 00:02:24.852 TEST_HEADER include/spdk/pipe.h 00:02:24.852 
TEST_HEADER include/spdk/queue.h 00:02:24.852 TEST_HEADER include/spdk/reduce.h 00:02:24.852 CC test/event/scheduler/scheduler.o 00:02:24.852 TEST_HEADER include/spdk/rpc.h 00:02:24.852 CC test/lvol/esnap/esnap.o 00:02:24.852 TEST_HEADER include/spdk/scheduler.h 00:02:24.852 LINK spdk_lspci 00:02:24.852 TEST_HEADER include/spdk/scsi.h 00:02:24.852 TEST_HEADER include/spdk/scsi_spec.h 00:02:24.852 TEST_HEADER include/spdk/sock.h 00:02:24.852 CC test/env/mem_callbacks/mem_callbacks.o 00:02:24.852 TEST_HEADER include/spdk/stdinc.h 00:02:24.852 TEST_HEADER include/spdk/string.h 00:02:24.852 TEST_HEADER include/spdk/thread.h 00:02:24.852 LINK rpc_client_test 00:02:24.852 TEST_HEADER include/spdk/trace.h 00:02:24.852 TEST_HEADER include/spdk/trace_parser.h 00:02:24.852 TEST_HEADER include/spdk/tree.h 00:02:24.852 TEST_HEADER include/spdk/ublk.h 00:02:25.115 LINK spdk_nvme_discover 00:02:25.115 TEST_HEADER include/spdk/util.h 00:02:25.115 TEST_HEADER include/spdk/uuid.h 00:02:25.115 TEST_HEADER include/spdk/version.h 00:02:25.115 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:25.115 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:25.115 TEST_HEADER include/spdk/vhost.h 00:02:25.115 TEST_HEADER include/spdk/vmd.h 00:02:25.115 LINK spdk_trace_record 00:02:25.115 TEST_HEADER include/spdk/xor.h 00:02:25.115 TEST_HEADER include/spdk/zipf.h 00:02:25.115 CXX test/cpp_headers/accel.o 00:02:25.115 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:25.115 LINK interrupt_tgt 00:02:25.115 LINK histogram_perf 00:02:25.115 LINK nvmf_tgt 00:02:25.115 LINK lsvmd 00:02:25.115 LINK reactor_perf 00:02:25.115 LINK zipf 00:02:25.115 LINK jsoncat 00:02:25.115 LINK event_perf 00:02:25.115 LINK vhost 00:02:25.115 LINK led 00:02:25.115 LINK iscsi_tgt 00:02:25.115 LINK startup 00:02:25.115 LINK vtophys 00:02:25.115 LINK poller_perf 00:02:25.115 LINK reactor 00:02:25.115 LINK cmb_copy 00:02:25.115 LINK env_dpdk_post_init 00:02:25.115 LINK connect_stress 00:02:25.115 LINK reserve 00:02:25.115 LINK 
pmr_persistence 00:02:25.115 LINK stub 00:02:25.115 LINK spdk_tgt 00:02:25.115 LINK boot_partition 00:02:25.115 LINK app_repeat 00:02:25.115 LINK bdev_svc 00:02:25.115 LINK hello_world 00:02:25.378 LINK hello_sock 00:02:25.378 LINK hotplug 00:02:25.378 LINK reset 00:02:25.378 LINK err_injection 00:02:25.378 LINK simple_copy 00:02:25.378 LINK verify 00:02:25.378 LINK ioat_perf 00:02:25.378 LINK spdk_trace 00:02:25.378 LINK hello_bdev 00:02:25.378 LINK mkfs 00:02:25.378 LINK sgl 00:02:25.378 LINK nvme_dp 00:02:25.378 LINK overhead 00:02:25.378 LINK aer 00:02:25.378 CXX test/cpp_headers/accel_module.o 00:02:25.378 LINK spdk_dd 00:02:25.378 LINK arbitration 00:02:25.378 LINK thread 00:02:25.378 LINK hello_blob 00:02:25.378 LINK idxd_perf 00:02:25.378 LINK scheduler 00:02:25.378 LINK reconnect 00:02:25.378 LINK nvme_compliance 00:02:25.378 LINK abort 00:02:25.378 CXX test/cpp_headers/assert.o 00:02:25.378 CXX test/cpp_headers/barrier.o 00:02:25.378 CXX test/cpp_headers/base64.o 00:02:25.378 LINK nvmf 00:02:25.639 CXX test/cpp_headers/bdev.o 00:02:25.639 CXX test/cpp_headers/bdev_module.o 00:02:25.639 CXX test/cpp_headers/bdev_zone.o 00:02:25.639 CXX test/cpp_headers/bit_array.o 00:02:25.639 LINK pci_ut 00:02:25.639 CXX test/cpp_headers/bit_pool.o 00:02:25.639 LINK dif 00:02:25.639 CXX test/cpp_headers/blob_bdev.o 00:02:25.639 CXX test/cpp_headers/blobfs_bdev.o 00:02:25.639 CXX test/cpp_headers/blobfs.o 00:02:25.639 CXX test/cpp_headers/blob.o 00:02:25.639 CXX test/cpp_headers/conf.o 00:02:25.639 CC test/nvme/fused_ordering/fused_ordering.o 00:02:25.639 LINK bdevio 00:02:25.639 LINK test_dma 00:02:25.639 CXX test/cpp_headers/config.o 00:02:25.639 CXX test/cpp_headers/cpuset.o 00:02:25.639 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:25.639 CXX test/cpp_headers/crc16.o 00:02:25.639 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:25.639 CC test/nvme/fdp/fdp.o 00:02:25.639 CXX test/cpp_headers/crc32.o 00:02:25.639 CXX test/cpp_headers/crc64.o 00:02:25.639 CXX 
test/cpp_headers/dif.o 00:02:25.639 LINK accel_perf 00:02:25.639 CXX test/cpp_headers/dma.o 00:02:25.639 CXX test/cpp_headers/endian.o 00:02:25.639 CXX test/cpp_headers/env_dpdk.o 00:02:25.639 CXX test/cpp_headers/env.o 00:02:25.639 CXX test/cpp_headers/event.o 00:02:25.639 CXX test/cpp_headers/fd_group.o 00:02:25.639 CXX test/cpp_headers/fd.o 00:02:25.639 CXX test/cpp_headers/file.o 00:02:25.639 CXX test/cpp_headers/ftl.o 00:02:25.639 CXX test/cpp_headers/gpt_spec.o 00:02:25.639 LINK nvme_manage 00:02:25.639 CXX test/cpp_headers/hexlify.o 00:02:25.639 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:25.639 CXX test/cpp_headers/histogram_data.o 00:02:25.639 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:25.639 CXX test/cpp_headers/idxd.o 00:02:25.639 CC test/nvme/cuse/cuse.o 00:02:25.639 LINK spdk_nvme 00:02:25.639 CXX test/cpp_headers/init.o 00:02:25.639 CXX test/cpp_headers/idxd_spec.o 00:02:25.639 CXX test/cpp_headers/ioat.o 00:02:25.899 CXX test/cpp_headers/ioat_spec.o 00:02:25.899 LINK spdk_bdev 00:02:25.899 CXX test/cpp_headers/iscsi_spec.o 00:02:25.899 CXX test/cpp_headers/json.o 00:02:25.899 CXX test/cpp_headers/jsonrpc.o 00:02:25.899 CXX test/cpp_headers/keyring.o 00:02:25.899 LINK nvme_fuzz 00:02:25.899 CXX test/cpp_headers/likely.o 00:02:25.899 CXX test/cpp_headers/keyring_module.o 00:02:25.899 CXX test/cpp_headers/log.o 00:02:25.899 CXX test/cpp_headers/lvol.o 00:02:25.899 CXX test/cpp_headers/memory.o 00:02:25.899 CXX test/cpp_headers/mmio.o 00:02:25.899 LINK blobcli 00:02:25.899 CXX test/cpp_headers/nbd.o 00:02:25.899 CXX test/cpp_headers/notify.o 00:02:25.899 CXX test/cpp_headers/nvme.o 00:02:25.899 CXX test/cpp_headers/nvme_intel.o 00:02:25.899 CXX test/cpp_headers/nvme_ocssd.o 00:02:25.899 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:25.899 CXX test/cpp_headers/nvme_spec.o 00:02:25.899 CXX test/cpp_headers/nvme_zns.o 00:02:25.899 CXX test/cpp_headers/nvmf_cmd.o 00:02:25.899 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:26.159 CXX 
test/cpp_headers/nvmf.o 00:02:26.159 CXX test/cpp_headers/nvmf_spec.o 00:02:26.159 CXX test/cpp_headers/nvmf_transport.o 00:02:26.159 CXX test/cpp_headers/opal.o 00:02:26.159 CXX test/cpp_headers/opal_spec.o 00:02:26.159 CXX test/cpp_headers/pci_ids.o 00:02:26.159 CXX test/cpp_headers/pipe.o 00:02:26.159 LINK fused_ordering 00:02:26.159 CXX test/cpp_headers/queue.o 00:02:26.159 LINK spdk_nvme_perf 00:02:26.159 CXX test/cpp_headers/reduce.o 00:02:26.159 CXX test/cpp_headers/rpc.o 00:02:26.159 LINK mem_callbacks 00:02:26.159 CXX test/cpp_headers/scheduler.o 00:02:26.159 CXX test/cpp_headers/scsi.o 00:02:26.159 CXX test/cpp_headers/scsi_spec.o 00:02:26.159 CXX test/cpp_headers/sock.o 00:02:26.159 CXX test/cpp_headers/stdinc.o 00:02:26.159 CXX test/cpp_headers/string.o 00:02:26.159 CXX test/cpp_headers/thread.o 00:02:26.159 CXX test/cpp_headers/trace.o 00:02:26.159 LINK doorbell_aers 00:02:26.159 CXX test/cpp_headers/trace_parser.o 00:02:26.159 CXX test/cpp_headers/tree.o 00:02:26.159 CXX test/cpp_headers/ublk.o 00:02:26.159 CXX test/cpp_headers/util.o 00:02:26.159 CXX test/cpp_headers/uuid.o 00:02:26.159 CXX test/cpp_headers/version.o 00:02:26.159 CXX test/cpp_headers/vfio_user_pci.o 00:02:26.159 CXX test/cpp_headers/vfio_user_spec.o 00:02:26.159 LINK spdk_nvme_identify 00:02:26.159 CXX test/cpp_headers/vhost.o 00:02:26.159 LINK spdk_top 00:02:26.159 CXX test/cpp_headers/vmd.o 00:02:26.159 CXX test/cpp_headers/xor.o 00:02:26.159 CXX test/cpp_headers/zipf.o 00:02:26.420 LINK memory_ut 00:02:26.420 LINK fdp 00:02:26.420 LINK bdevperf 00:02:26.678 LINK vhost_fuzz 00:02:27.245 LINK cuse 00:02:27.504 LINK iscsi_fuzz 00:02:30.793 LINK esnap 00:02:30.793 00:02:30.793 real 1m30.669s 00:02:30.793 user 17m29.747s 00:02:30.793 sys 4m24.534s 00:02:30.793 11:38:57 make -- common/autotest_common.sh@1122 -- $ xtrace_disable 00:02:30.793 11:38:57 make -- common/autotest_common.sh@10 -- $ set +x 00:02:30.793 ************************************ 00:02:30.793 END TEST make 00:02:30.793 
************************************ 00:02:30.793 11:38:57 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:30.793 11:38:57 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:30.793 11:38:57 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:30.793 11:38:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:30.793 11:38:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:30.793 11:38:57 -- pm/common@44 -- $ pid=1509542 00:02:30.793 11:38:57 -- pm/common@50 -- $ kill -TERM 1509542 00:02:30.793 11:38:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:30.793 11:38:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:30.793 11:38:57 -- pm/common@44 -- $ pid=1509543 00:02:30.793 11:38:57 -- pm/common@50 -- $ kill -TERM 1509543 00:02:30.793 11:38:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:30.793 11:38:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:30.793 11:38:57 -- pm/common@44 -- $ pid=1509545 00:02:30.793 11:38:57 -- pm/common@50 -- $ kill -TERM 1509545 00:02:30.793 11:38:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:30.793 11:38:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:30.793 11:38:57 -- pm/common@44 -- $ pid=1509570 00:02:30.793 11:38:57 -- pm/common@50 -- $ sudo -E kill -TERM 1509570 00:02:31.053 11:38:57 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:31.053 11:38:57 -- nvmf/common.sh@7 -- # uname -s 00:02:31.053 11:38:57 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:31.053 11:38:57 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:31.053 11:38:57 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 
00:02:31.053 11:38:57 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:31.053 11:38:57 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:31.053 11:38:58 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:31.053 11:38:58 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:31.053 11:38:58 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:31.053 11:38:58 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:31.053 11:38:58 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:31.053 11:38:58 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:02:31.053 11:38:58 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:02:31.053 11:38:58 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:31.053 11:38:58 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:31.053 11:38:58 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:31.053 11:38:58 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:31.053 11:38:58 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:31.053 11:38:58 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:31.053 11:38:58 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:31.053 11:38:58 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:31.053 11:38:58 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:31.053 11:38:58 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:31.053 11:38:58 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:31.053 11:38:58 -- paths/export.sh@5 -- # export PATH 00:02:31.053 11:38:58 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:31.053 11:38:58 -- nvmf/common.sh@47 -- # : 0 00:02:31.053 11:38:58 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:31.053 11:38:58 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:31.053 11:38:58 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:31.053 11:38:58 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:31.053 11:38:58 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:31.053 11:38:58 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:31.053 11:38:58 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:31.053 11:38:58 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:31.053 11:38:58 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:31.053 11:38:58 -- spdk/autotest.sh@32 -- # uname -s 00:02:31.053 11:38:58 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:31.053 11:38:58 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:31.053 11:38:58 -- spdk/autotest.sh@34 -- # mkdir -p 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:31.053 11:38:58 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:31.053 11:38:58 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:31.053 11:38:58 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:31.053 11:38:58 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:31.053 11:38:58 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:31.053 11:38:58 -- spdk/autotest.sh@48 -- # udevadm_pid=1575625 00:02:31.053 11:38:58 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:31.053 11:38:58 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:31.053 11:38:58 -- pm/common@17 -- # local monitor 00:02:31.053 11:38:58 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:31.053 11:38:58 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:31.053 11:38:58 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:31.053 11:38:58 -- pm/common@21 -- # date +%s 00:02:31.053 11:38:58 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:31.053 11:38:58 -- pm/common@21 -- # date +%s 00:02:31.053 11:38:58 -- pm/common@25 -- # sleep 1 00:02:31.053 11:38:58 -- pm/common@21 -- # date +%s 00:02:31.053 11:38:58 -- pm/common@21 -- # date +%s 00:02:31.053 11:38:58 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715679538 00:02:31.053 11:38:58 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715679538 00:02:31.053 11:38:58 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715679538 00:02:31.053 11:38:58 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715679538 00:02:31.053 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715679538_collect-vmstat.pm.log 00:02:31.053 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715679538_collect-cpu-load.pm.log 00:02:31.053 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715679538_collect-cpu-temp.pm.log 00:02:31.053 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715679538_collect-bmc-pm.bmc.pm.log 00:02:31.990 11:38:59 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:31.990 11:38:59 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:31.990 11:38:59 -- common/autotest_common.sh@720 -- # xtrace_disable 00:02:31.990 11:38:59 -- common/autotest_common.sh@10 -- # set +x 00:02:31.990 11:38:59 -- spdk/autotest.sh@59 -- # create_test_list 00:02:31.990 11:38:59 -- common/autotest_common.sh@744 -- # xtrace_disable 00:02:31.990 11:38:59 -- common/autotest_common.sh@10 -- # set +x 00:02:32.248 11:38:59 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:32.248 11:38:59 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:32.248 11:38:59 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:32.248 11:38:59 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:32.248 11:38:59 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:02:32.248 11:38:59 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:32.248 11:38:59 -- common/autotest_common.sh@1451 -- # uname 00:02:32.248 11:38:59 -- common/autotest_common.sh@1451 -- # '[' Linux = FreeBSD ']' 00:02:32.248 11:38:59 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:32.248 11:38:59 -- common/autotest_common.sh@1471 -- # uname 00:02:32.248 11:38:59 -- common/autotest_common.sh@1471 -- # [[ Linux = FreeBSD ]] 00:02:32.248 11:38:59 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:32.248 11:38:59 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:32.248 11:38:59 -- spdk/autotest.sh@72 -- # hash lcov 00:02:32.248 11:38:59 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:32.248 11:38:59 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:32.248 --rc lcov_branch_coverage=1 00:02:32.248 --rc lcov_function_coverage=1 00:02:32.248 --rc genhtml_branch_coverage=1 00:02:32.248 --rc genhtml_function_coverage=1 00:02:32.248 --rc genhtml_legend=1 00:02:32.248 --rc geninfo_all_blocks=1 00:02:32.248 ' 00:02:32.248 11:38:59 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:32.248 --rc lcov_branch_coverage=1 00:02:32.248 --rc lcov_function_coverage=1 00:02:32.248 --rc genhtml_branch_coverage=1 00:02:32.248 --rc genhtml_function_coverage=1 00:02:32.248 --rc genhtml_legend=1 00:02:32.248 --rc geninfo_all_blocks=1 00:02:32.248 ' 00:02:32.248 11:38:59 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:32.248 --rc lcov_branch_coverage=1 00:02:32.248 --rc lcov_function_coverage=1 00:02:32.248 --rc genhtml_branch_coverage=1 00:02:32.248 --rc genhtml_function_coverage=1 00:02:32.248 --rc genhtml_legend=1 00:02:32.248 --rc geninfo_all_blocks=1 00:02:32.248 --no-external' 00:02:32.248 11:38:59 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:32.248 --rc lcov_branch_coverage=1 00:02:32.248 --rc lcov_function_coverage=1 00:02:32.248 --rc genhtml_branch_coverage=1 00:02:32.248 --rc genhtml_function_coverage=1 00:02:32.248 --rc 
genhtml_legend=1 00:02:32.248 --rc geninfo_all_blocks=1 00:02:32.248 --no-external' 00:02:32.248 11:38:59 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:32.248 lcov: LCOV version 1.14 00:02:32.248 11:38:59 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:02:42.271 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:02:42.271 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:02:43.208 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno:no functions found 00:02:43.208 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:02:43.208 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno:no functions found 00:02:43.208 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:02:43.208 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno:no functions found 00:02:43.208 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:02:55.421 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions 
found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:02:55.421 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:02:55.421 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 
00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions 
found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:02:55.422 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 
00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:02:55.422 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:02:55.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:02:55.682 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 
00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:02:55.682 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:02:55.682 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions 
found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:02:55.942 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:02:55.942 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:02:55.942 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:02:57.321 11:39:24 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:02:57.321 11:39:24 -- common/autotest_common.sh@720 -- # xtrace_disable 00:02:57.321 11:39:24 -- common/autotest_common.sh@10 -- # set +x 00:02:57.321 11:39:24 -- spdk/autotest.sh@91 -- # rm -f 00:02:57.321 11:39:24 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:01.514 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:01.514 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:01.514 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:03:01.514 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:01.514 0000:80:04.0 (8086 2021): 
Already using the ioatdma driver 00:03:01.514 11:39:28 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:01.514 11:39:28 -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:03:01.514 11:39:28 -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:03:01.514 11:39:28 -- common/autotest_common.sh@1666 -- # local nvme bdf 00:03:01.514 11:39:28 -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:03:01.514 11:39:28 -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:03:01.514 11:39:28 -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:03:01.514 11:39:28 -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:01.514 11:39:28 -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:03:01.514 11:39:28 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:01.514 11:39:28 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:01.514 11:39:28 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:01.514 11:39:28 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:01.514 11:39:28 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:01.514 11:39:28 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:01.514 No valid GPT data, bailing 00:03:01.514 11:39:28 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:01.514 11:39:28 -- scripts/common.sh@391 -- # pt= 00:03:01.514 11:39:28 -- scripts/common.sh@392 -- # return 1 00:03:01.514 11:39:28 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:01.514 1+0 records in 00:03:01.514 1+0 records out 00:03:01.514 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0025841 s, 406 MB/s 00:03:01.514 11:39:28 -- spdk/autotest.sh@118 -- # sync 00:03:01.514 11:39:28 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:01.514 11:39:28 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 
00:03:01.514 11:39:28 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:06.788 11:39:33 -- spdk/autotest.sh@124 -- # uname -s 00:03:06.788 11:39:33 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:06.788 11:39:33 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:06.788 11:39:33 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:06.788 11:39:33 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:06.788 11:39:33 -- common/autotest_common.sh@10 -- # set +x 00:03:06.788 ************************************ 00:03:06.788 START TEST setup.sh 00:03:06.788 ************************************ 00:03:06.788 11:39:33 setup.sh -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:06.788 * Looking for test storage... 00:03:06.788 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:06.788 11:39:33 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:06.788 11:39:33 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:06.788 11:39:33 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:06.788 11:39:33 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:06.788 11:39:33 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:06.788 11:39:33 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:06.788 ************************************ 00:03:06.788 START TEST acl 00:03:06.788 ************************************ 00:03:06.788 11:39:33 setup.sh.acl -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:06.788 * Looking for test storage... 
00:03:06.788 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:06.788 11:39:33 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:06.788 11:39:33 setup.sh.acl -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:03:06.788 11:39:33 setup.sh.acl -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:03:06.788 11:39:33 setup.sh.acl -- common/autotest_common.sh@1666 -- # local nvme bdf 00:03:06.788 11:39:33 setup.sh.acl -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:03:06.788 11:39:33 setup.sh.acl -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:03:06.788 11:39:33 setup.sh.acl -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:03:06.788 11:39:33 setup.sh.acl -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:06.788 11:39:33 setup.sh.acl -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:03:06.788 11:39:33 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:06.788 11:39:33 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:06.788 11:39:33 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:06.788 11:39:33 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:06.788 11:39:33 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:06.788 11:39:33 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:06.788 11:39:33 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:11.051 11:39:37 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:11.051 11:39:37 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:11.051 11:39:37 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:11.051 11:39:37 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:11.051 11:39:37 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:11.051 11:39:37 setup.sh.acl -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 Hugepages 00:03:15.241 node hugesize free / total 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 00:03:15.241 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 
11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- 
# [[ 0000:00:04.7 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:15.241 11:39:41 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 
00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:15.242 11:39:41 setup.sh.acl -- 
setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:15.242 11:39:41 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:15.242 11:39:41 setup.sh.acl -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:15.242 11:39:41 setup.sh.acl -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:15.242 11:39:41 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:15.242 ************************************ 00:03:15.242 START TEST denied 00:03:15.242 ************************************ 00:03:15.242 11:39:41 setup.sh.acl.denied -- common/autotest_common.sh@1121 -- # denied 00:03:15.242 11:39:41 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:03:15.242 11:39:41 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:15.242 11:39:41 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:15.242 11:39:41 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:15.242 11:39:41 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:19.435 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:03:19.436 11:39:45 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:03:19.436 11:39:45 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:19.436 11:39:45 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:19.436 11:39:45 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:03:19.436 11:39:45 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:03:19.436 11:39:45 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:19.436 11:39:45 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:19.436 11:39:45 setup.sh.acl.denied 
-- setup/acl.sh@41 -- # setup reset 00:03:19.436 11:39:45 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:19.436 11:39:45 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:24.705 00:03:24.705 real 0m8.898s 00:03:24.705 user 0m2.770s 00:03:24.705 sys 0m5.399s 00:03:24.705 11:39:50 setup.sh.acl.denied -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:24.705 11:39:50 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:24.705 ************************************ 00:03:24.705 END TEST denied 00:03:24.705 ************************************ 00:03:24.705 11:39:50 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:24.705 11:39:50 setup.sh.acl -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:24.705 11:39:50 setup.sh.acl -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:24.705 11:39:50 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:24.705 ************************************ 00:03:24.705 START TEST allowed 00:03:24.705 ************************************ 00:03:24.705 11:39:50 setup.sh.acl.allowed -- common/autotest_common.sh@1121 -- # allowed 00:03:24.705 11:39:50 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:03:24.705 11:39:50 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:24.705 11:39:50 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:03:24.705 11:39:50 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:24.705 11:39:50 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:31.266 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:03:31.266 11:39:57 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:31.266 11:39:57 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:31.266 11:39:57 
setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:31.266 11:39:57 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:31.266 11:39:57 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:34.550 00:03:34.550 real 0m10.553s 00:03:34.550 user 0m2.727s 00:03:34.550 sys 0m5.342s 00:03:34.550 11:40:01 setup.sh.acl.allowed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:34.550 11:40:01 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:34.550 ************************************ 00:03:34.550 END TEST allowed 00:03:34.550 ************************************ 00:03:34.550 00:03:34.550 real 0m27.862s 00:03:34.550 user 0m8.574s 00:03:34.550 sys 0m16.365s 00:03:34.550 11:40:01 setup.sh.acl -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:34.550 11:40:01 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:34.550 ************************************ 00:03:34.550 END TEST acl 00:03:34.550 ************************************ 00:03:34.550 11:40:01 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:34.550 11:40:01 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:34.550 11:40:01 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:34.550 11:40:01 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:34.550 ************************************ 00:03:34.550 START TEST hugepages 00:03:34.550 ************************************ 00:03:34.550 11:40:01 setup.sh.hugepages -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:34.811 * Looking for test storage... 
00:03:34.811 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 75769396 kB' 'MemAvailable: 80208068 kB' 'Buffers: 14216 kB' 'Cached: 10423392 kB' 'SwapCached: 0 kB' 'Active: 6511704 kB' 'Inactive: 4409864 kB' 'Active(anon): 5942372 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 487820 kB' 'Mapped: 184740 kB' 'Shmem: 5458412 kB' 'KReclaimable: 225788 kB' 'Slab: 545536 kB' 'SReclaimable: 225788 kB' 'SUnreclaim: 319748 kB' 'KernelStack: 16016 kB' 'PageTables: 8076 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438196 kB' 'Committed_AS: 7241560 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.811 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 
11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ 
SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 
11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce 
== \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.812 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.813 11:40:01 
setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@18 -- # 
global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 
00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:34.813 11:40:01 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:34.813 11:40:01 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:34.813 11:40:01 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:34.813 11:40:01 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:34.813 ************************************ 00:03:34.813 START TEST default_setup 00:03:34.813 ************************************ 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1121 -- # default_setup 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:34.813 11:40:01 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.813 11:40:01 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:38.101 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:38.101 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:38.360 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:00:04.4 (8086 2021): ioatdma 
-> vfio-pci 00:03:38.360 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:38.360 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:40.932 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local 
var val 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77931080 kB' 'MemAvailable: 82369704 kB' 'Buffers: 14216 kB' 'Cached: 10423512 kB' 'SwapCached: 0 kB' 'Active: 6530084 kB' 'Inactive: 4409864 kB' 'Active(anon): 5960752 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 505540 kB' 'Mapped: 184928 kB' 'Shmem: 5458532 kB' 'KReclaimable: 225692 kB' 'Slab: 544764 kB' 'SReclaimable: 225692 kB' 'SUnreclaim: 319072 kB' 'KernelStack: 16448 kB' 'PageTables: 9012 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7260304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:40.932 11:40:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:40.932 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.198 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
continue 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.199 11:40:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # 
mapfile -t mem 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77933704 kB' 'MemAvailable: 82372328 kB' 'Buffers: 14216 kB' 'Cached: 10423512 kB' 'SwapCached: 0 kB' 'Active: 6530504 kB' 'Inactive: 4409864 kB' 'Active(anon): 5961172 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 506028 kB' 'Mapped: 184928 kB' 'Shmem: 5458532 kB' 'KReclaimable: 225692 kB' 'Slab: 544788 kB' 'SReclaimable: 225692 kB' 'SUnreclaim: 319096 kB' 'KernelStack: 16480 kB' 'PageTables: 8936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7260324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200776 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:03:41.199 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ [... identical IFS=': '/read/continue trace repeated for each /proc/meminfo field until HugePages_Surp matches ...] 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77933116 kB' 'MemAvailable: 82371740 kB' 'Buffers: 14216 kB' 'Cached: 10423516 kB' 'SwapCached: 0 kB' 'Active: 6529368 kB' 'Inactive: 4409864 kB' 'Active(anon): 5960036 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 504764 kB' 'Mapped: 184876 kB' 'Shmem: 5458536 kB' 'KReclaimable: 225692 kB' 'Slab: 544776 kB' 'SReclaimable: 225692 kB' 'SUnreclaim: 319084 kB' 'KernelStack: 16048 kB' 'PageTables: 8024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7260344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200792 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:41.201 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue [... identical IFS=': '/read/continue trace repeated for each /proc/meminfo field while scanning for HugePages_Rsvd (truncated) ...]
11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 
11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.202 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.203 11:40:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:41.203 nr_hugepages=1024 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:41.203 resv_hugepages=0 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:41.203 surplus_hugepages=0 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:41.203 anon_hugepages=0 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77931436 kB' 'MemAvailable: 82370060 kB' 'Buffers: 14216 kB' 'Cached: 10423556 kB' 'SwapCached: 0 kB' 'Active: 6528796 kB' 'Inactive: 4409864 kB' 'Active(anon): 5959464 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 504112 kB' 'Mapped: 184872 kB' 'Shmem: 5458576 kB' 'KReclaimable: 225692 kB' 'Slab: 544776 kB' 'SReclaimable: 225692 kB' 'SUnreclaim: 319084 kB' 'KernelStack: 16032 kB' 'PageTables: 7828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7260368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:41.203 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # (read/continue loop over /proc/meminfo keys, MemTotal through CmaFree: none matched HugePages_Total yet) 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31
-- # read -r var val _ 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:41.204 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( 
nodes_test[node] += resv ))
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 35817076 kB' 'MemUsed: 12252812 kB' 'SwapCached: 0 kB' 'Active: 5174808 kB' 'Inactive: 4207904 kB' 'Active(anon): 4705156 kB' 'Inactive(anon): 0 kB' 'Active(file): 469652 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9127104 kB' 'Mapped: 144336 kB' 'AnonPages: 258728 kB' 'Shmem: 4449548 kB' 'KernelStack: 9800 kB' 'PageTables: 5720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 147840 kB' 'Slab: 333336 kB' 'SReclaimable: 147840 kB' 'SUnreclaim: 185496 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:41.205 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # (identical IFS=': ' / read -r var val _ / continue iterations condensed for each non-matching key in the list above, MemTotal through HugePages_Free)
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:41.206 11:40:08
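Every lookup in this trace (HugePages_Total, HugePages_Surp, and later AnonHugePages) reuses the same setup/common.sh read loop: pick a meminfo-style file, split each line on `': '`, and print the value of the one matching key. A minimal standalone sketch of that pattern follows; the helper body and the sample file are an illustration built from the traced commands, not SPDK's actual script (which also strips the `Node N ` prefix from per-node files).

```shell
#!/usr/bin/env bash
# Illustrative re-creation of the meminfo lookup traced above: scan a
# meminfo-style file with IFS=': ' and print one key's value.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val rest
    while IFS=': ' read -r var val rest; do
        if [[ $var == "$get" ]]; then
            echo "$val"        # e.g. 1024 for HugePages_Total
            return 0
        fi
    done < "$mem_f"
    return 1                   # key not present in the file
}

# Demo against a sample built from this run's node0 values.
mem_sample=$(mktemp)
printf '%s\n' 'MemTotal: 48069888 kB' 'HugePages_Total: 1024' \
    'HugePages_Free: 1024' 'HugePages_Surp: 0' > "$mem_sample"
surp=$(get_meminfo HugePages_Surp "$mem_sample")
rm -f "$mem_sample"
echo "HugePages_Surp=$surp"
```

Because `': '` contains both a non-whitespace delimiter and a space, `read` collapses `Key: value kB` into the three fields `var`, `val`, `rest`, which is why the trace's checks compare `var` directly against the requested key.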
setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:41.206 node0=1024 expecting 1024
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:41.206
00:03:41.206 real	0m6.398s
00:03:41.206 user	0m1.386s
00:03:41.206 sys	0m2.570s
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1122 -- # xtrace_disable
00:03:41.206 11:40:08 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x
00:03:41.206 ************************************
00:03:41.206 END TEST default_setup
00:03:41.206 ************************************
00:03:41.206 11:40:08 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:41.206 11:40:08 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']'
00:03:41.206 11:40:08 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable
00:03:41.206 11:40:08 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:03:41.206 ************************************
00:03:41.206 START TEST per_node_1G_alloc
00:03:41.206 ************************************
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1121 -- # per_node_1G_alloc
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=,
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:41.206 11:40:08
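The `get_test_nr_hugepages` / `get_test_nr_hugepages_per_node` steps traced above reduce to a few lines of arithmetic: 1048576 kB (1 GiB) per node is converted into default-size hugepages and assigned to each requested node. This is an illustrative condensation, not the script itself; variable names mirror the trace, and the 2048 kB page size is taken from this run's `Hugepagesize`.

```shell
#!/usr/bin/env bash
# Condensed sketch of the per-node hugepage sizing seen in the trace.
size=1048576                  # kB requested per node (1 GiB)
default_hugepages=2048        # kB per hugepage on this system
node_ids=(0 1)                # HUGENODE list from the trace

nr_hugepages=$(( size / default_hugepages ))   # pages per node

declare -A nodes_test
for node in "${node_ids[@]}"; do
    nodes_test[$node]=$nr_hugepages            # 512 for node0 and node1
done

echo "NRHUGE=$nr_hugepages HUGENODE=${node_ids[0]},${node_ids[1]}"
```

With these inputs the sketch reproduces the trace's `NRHUGE=512 HUGENODE=0,1`, which setup.sh then consumes to populate each node's hugepage pool.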
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:03:41.206 11:40:08 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:03:45.414 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:03:45.414 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:03:45.414 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:03:45.414 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:45.414 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:45.414
11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- #
read -r var val _
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77934992 kB' 'MemAvailable: 82373568 kB' 'Buffers: 14216 kB' 'Cached: 10423644 kB' 'SwapCached: 0 kB' 'Active: 6527704 kB' 'Inactive: 4409864 kB' 'Active(anon): 5958372 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 502568 kB' 'Mapped: 183916 kB' 'Shmem: 5458664 kB' 'KReclaimable: 225596 kB' 'Slab: 544388 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318792 kB' 'KernelStack: 16032 kB' 'PageTables: 7924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7251332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB'
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc
-- setup/common.sh@32 -- # continue
00:03:45.414 11:40:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31-32 -- # (identical IFS=': ' / read -r var val _ / continue iterations condensed for the non-matching keys MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked and SwapTotal)
00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc --
setup/common.sh@32 -- # continue 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.414 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77934336 kB' 'MemAvailable: 82372912 kB' 'Buffers: 14216 kB' 'Cached: 10423648 kB' 'SwapCached: 0 kB' 'Active: 6527356 kB' 'Inactive: 4409864 kB' 'Active(anon): 5958024 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 502684 kB' 'Mapped: 183852 kB' 'Shmem: 5458668 kB' 'KReclaimable: 225596 kB' 'Slab: 544360 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318764 kB' 'KernelStack: 15952 kB' 'PageTables: 7712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7252744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 
11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.415 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 
-- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.416 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@99 -- # surp=0 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77937648 kB' 'MemAvailable: 82376224 kB' 'Buffers: 14216 kB' 'Cached: 10423664 kB' 'SwapCached: 0 kB' 'Active: 6527472 kB' 'Inactive: 4409864 kB' 'Active(anon): 5958140 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 502756 kB' 'Mapped: 183852 kB' 'Shmem: 5458684 kB' 'KReclaimable: 225596 kB' 'Slab: 544360 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 
318764 kB' 'KernelStack: 16080 kB' 'PageTables: 8072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7251380 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.417 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.417 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:45.418 nr_hugepages=1024 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:45.418 resv_hugepages=0 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:45.418 surplus_hugepages=0 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:45.418 anon_hugepages=0 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local 
mem_f mem
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77932220 kB' 'MemAvailable: 82370796 kB' 'Buffers: 14216 kB' 'Cached: 10423688 kB' 'SwapCached: 0 kB' 'Active: 6530988 kB' 'Inactive: 4409864 kB' 'Active(anon): 5961656 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 506160 kB' 'Mapped: 184356 kB' 'Shmem: 5458708 kB' 'KReclaimable: 225596 kB' 'Slab: 544360 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318764 kB' 'KernelStack: 16080 kB' 'PageTables: 7936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7257312 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB'
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:03:45.418 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
[... identical read / compare / continue cycle repeated for every remaining meminfo field, MemFree through Unaccepted ...]
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc --
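The trace above is `setup/common.sh`'s `get_meminfo` helper scanning `/proc/meminfo` field by field: `mapfile` the file into an array, strip any `Node N ` prefix, split each line with `IFS=': '`, skip fields until the requested key matches, then echo its value. A minimal self-contained sketch of that scan (hypothetical function name; inline sample data stands in for `/proc/meminfo` so it runs anywhere):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo field scan from the trace above; the real logic
# lives in SPDK's test/setup/common.sh, this demo only mirrors its shape.
shopt -s extglob  # needed for the +([0-9]) pattern in the prefix strip

get_meminfo_demo() {
    local get=$1 line var val _
    # Sample per-node meminfo content in place of the real file; common.sh
    # reads the file itself with: mapfile -t mem < "$mem_f"
    local sample=$'Node 0 MemTotal: 48069888 kB\nNode 0 HugePages_Total: 1024\nNode 0 HugePages_Free: 1024'
    local -a mem
    mapfile -t mem <<<"$sample"
    # Strip the "Node N " prefix, exactly as common.sh@29 does.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        # Split "Key: value [unit]" on ':' and spaces (common.sh@31).
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] || continue   # common.sh@32: skip other fields
        echo "$val"                        # common.sh@33: emit the value
        return 0
    done
    return 1
}

get_meminfo_demo HugePages_Total  # prints 1024
```

In the real helper, passing a node number switches `mem_f` from `/proc/meminfo` to `/sys/devices/system/node/nodeN/meminfo`, which is exactly the branch visible later in this log (`common.sh@23`/`@24`).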
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 36860080 kB' 'MemUsed: 11209808 kB' 'SwapCached: 0 kB' 'Active: 5174740 kB' 'Inactive: 4207904 kB' 'Active(anon): 4705088 kB' 'Inactive(anon): 0 kB' 'Active(file): 469652 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9127124 kB' 'Mapped: 143316 kB' 'AnonPages: 258660 kB' 'Shmem: 4449568 kB' 'KernelStack: 9896 kB' 'PageTables: 5704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 147744 kB' 'Slab: 332884 kB' 'SReclaimable: 147744 kB' 'SUnreclaim: 185140 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:03:45.420 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
[... identical read / compare / continue cycle repeated for the remaining node0 fields, MemFree through ShmemPmdMapped ...]
00:03:45.421 11:40:12
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.421 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.421 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 41068084 kB' 'MemUsed: 3155516 kB' 'SwapCached: 0 kB' 'Active: 1352836 kB' 'Inactive: 201960 kB' 'Active(anon): 1253156 kB' 'Inactive(anon): 0 kB' 'Active(file): 99680 kB' 'Inactive(file): 201960 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1310804 kB' 'Mapped: 40536 kB' 'AnonPages: 244064 kB' 'Shmem: 1009164 kB' 'KernelStack: 6248 kB' 'PageTables: 2424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 77852 kB' 'Slab: 211476 kB' 'SReclaimable: 77852 kB' 'SUnreclaim: 133624 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.421 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.421 11:40:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ [identical "[[ <field> == HugePages_Surp ]] / continue / IFS=': ' / read -r var val _" xtrace records for the node1 meminfo fields elided; no field matches until the last] 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:45.422 node0=512 expecting 512 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:45.422 
11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:45.422 node1=512 expecting 512 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:45.422 00:03:45.422 real 0m3.932s 00:03:45.422 user 0m1.548s 00:03:45.422 sys 0m2.484s 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:45.422 11:40:12 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:45.422 ************************************ 00:03:45.422 END TEST per_node_1G_alloc 00:03:45.422 ************************************ 00:03:45.422 11:40:12 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:45.422 11:40:12 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:45.422 11:40:12 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:45.422 11:40:12 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:45.422 ************************************ 00:03:45.422 START TEST even_2G_alloc 00:03:45.422 ************************************ 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1121 -- # even_2G_alloc 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 
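The get_meminfo calls traced above follow a consistent pattern: pick /proc/meminfo or, when a node number is given, that node's meminfo file under /sys, strip the "Node N " prefix, then split each line on ': ' until the requested field is found. A minimal standalone re-implementation of that pattern (names follow the setup/common.sh trace; this is a sketch, not the actual SPDK script):

```shell
#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern below

# Sketch of the get_meminfo helper seen in the xtrace records:
# print the value of field $1 from /proc/meminfo, or from the
# per-node file when a node number is passed as $2.
get_meminfo() {
    local get=$1 node=$2
    local var val _ line
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}
```

Usage mirrors the trace, e.g. `get_meminfo HugePages_Surp 1` reads node1's surplus-hugepage count; the trailing unit (`kB`) is discarded by the third `read` field.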
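The get_test_nr_hugepages_per_node steps traced below (nr_hugepages=1024 split across _no_nodes=2, yielding nodes_test entries of 512 each, i.e. "node0=512 expecting 512") can be sketched like this (a minimal re-implementation inferred from the traced hugepages.sh lines, not the script itself):

```shell
#!/usr/bin/env bash
# Sketch of the even per-node hugepage split from the trace:
# walk the node indices from the back, assigning each node an
# equal share, as in "nodes_test[_no_nodes - 1]=512".
get_test_nr_hugepages_per_node() {
    local _nr_hugepages=$1
    local _no_nodes=$2
    local per_node=$(( _nr_hugepages / _no_nodes ))
    nodes_test=()   # global result array, one entry per node
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$per_node
        _no_nodes=$(( _no_nodes - 1 ))
    done
}
```

With the values from this run, `get_test_nr_hugepages_per_node 1024 2` leaves `nodes_test=(512 512)`, matching the "node0=512 expecting 512" / "node1=512 expecting 512" checks in the log.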
00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:45.422 11:40:12 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:48.707 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:48.707 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:48.707 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:03:48.707 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:48.707 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@89 -- # local node 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 
92293488 kB' 'MemFree: 77934164 kB' 'MemAvailable: 82372740 kB' 'Buffers: 14216 kB' 'Cached: 10423796 kB' 'SwapCached: 0 kB' 'Active: 6527488 kB' 'Inactive: 4409864 kB' 'Active(anon): 5958156 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 502476 kB' 'Mapped: 183904 kB' 'Shmem: 5458816 kB' 'KReclaimable: 225596 kB' 'Slab: 543840 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318244 kB' 'KernelStack: 15968 kB' 'PageTables: 7704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7250660 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.707 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.707 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.973 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.973 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.974 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77933912 kB' 'MemAvailable: 82372488 kB' 'Buffers: 14216 kB' 'Cached: 10423800 kB' 'SwapCached: 0 kB' 'Active: 6527132 kB' 'Inactive: 4409864 kB' 'Active(anon): 5957800 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 501792 kB' 'Mapped: 183844 kB' 'Shmem: 5458820 kB' 
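The long runs of `continue` in the trace above are `setup/common.sh`'s `get_meminfo` walking `/proc/meminfo` with `IFS=': '`, skipping every key until it reaches the requested one (here `AnonHugePages`, yielding `anon=0`). A minimal self-contained sketch of that pattern follows; the function name and sample file are illustrative, not the actual `setup/common.sh` source:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern seen in the trace: walk "Key: value kB"
# lines, fall through non-matching keys (the runs of 'continue' above), and
# print the value of the first matching key.
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # non-matching key: keep scanning
        echo "$val"
        return 0
    done <"$mem_f"
    return 1   # requested key not present
}

# Demo against a fixed snippet rather than the live /proc/meminfo:
sample=$(mktemp)
printf '%s\n' 'MemTotal: 92293488 kB' 'AnonHugePages: 0 kB' >"$sample"
get_meminfo_sketch AnonHugePages "$sample"   # prints: 0
rm -f "$sample"
```

With no second argument the sketch reads the live `/proc/meminfo`, matching how the trace invokes `get_meminfo AnonHugePages` and `get_meminfo HugePages_Surp` back to back.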
'KReclaimable: 225596 kB' 'Slab: 543896 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318300 kB' 'KernelStack: 15936 kB' 'PageTables: 7608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7250920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.974 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77933924 kB' 'MemAvailable: 82372500 kB' 'Buffers: 14216 kB' 'Cached: 10423828 kB' 'SwapCached: 0 kB' 'Active: 6527592 kB' 'Inactive: 4409864 kB' 'Active(anon): 5958260 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 
0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 502692 kB' 'Mapped: 183844 kB' 'Shmem: 5458848 kB' 'KReclaimable: 225596 kB' 'Slab: 543908 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318312 kB' 'KernelStack: 16000 kB' 'PageTables: 7840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7251200 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200840 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.975 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
00:03:48.976 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 
00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:48.977 nr_hugepages=1024 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:48.977 resv_hugepages=0 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:48.977 surplus_hugepages=0 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:48.977 anon_hugepages=0 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
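The trace above repeats one pattern: `setup/common.sh` dumps `/proc/meminfo` as `key: value` pairs, then reads it back line by line, comparing each key against the requested field (`HugePages_Surp`, `HugePages_Rsvd`, `HugePages_Total`) and echoing the matching value, or `0` when the scan falls through. A minimal standalone sketch of that parsing loop (the helper name `get_meminfo_value` is illustrative, not the script's actual function name):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern seen in the trace: scan "key: value"
# lines, print the value for the requested key, fall back to 0 when the
# key is absent (mirroring the "echo 0" at the end of the traced loop).
get_meminfo_value() {
  local get=$1 file=$2 var val _
  while IFS=': ' read -r var val _; do
    if [[ $var == "$get" ]]; then
      echo "$val"
      return 0
    fi
  done < "$file"
  echo 0
}

# Demo against a snapshot resembling the log's meminfo dump
tmp=$(mktemp)
printf '%s\n' 'MemTotal: 92293488 kB' 'HugePages_Total: 1024' \
  'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' > "$tmp"
get_meminfo_value HugePages_Total "$tmp"   # prints 1024
get_meminfo_value HugePages_Surp  "$tmp"   # prints 0
rm -f "$tmp"
```

Setting `IFS=': '` lets `read` split on the colon-space separator, so `val` holds the number and the trailing `kB` unit lands in the throwaway `_` variable, which is why the traced results (`surp=0`, `resv=0`) come back as bare integers.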
setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77934364 kB' 'MemAvailable: 82372940 kB' 'Buffers: 14216 kB' 'Cached: 10423868 kB' 'SwapCached: 0 kB' 'Active: 6527276 kB' 'Inactive: 4409864 kB' 'Active(anon): 5957944 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 502328 kB' 'Mapped: 183844 kB' 'Shmem: 5458888 kB' 'KReclaimable: 225596 kB' 'Slab: 543908 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318312 kB' 'KernelStack: 15984 kB' 'PageTables: 7776 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7251224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200840 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 
11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
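A side note on why every comparison in this trace renders as `\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l`: under `set -x`, bash backslash-escapes each character of the right-hand side of `[[ … == … ]]` when that side is quoted, i.e. matched literally instead of as a glob. The escaping is a display artifact of xtrace; the script source simply contains `[[ $var == "$get" ]]`. A quick demonstration (variable names are illustrative, not taken from setup/common.sh):

```shell
# xtrace prints the quoted right-hand side of [[ == ]] character-escaped,
# exactly as in the log above; tail isolates the [[ ]] trace line.
bash -xc 'var=CmaFree; get=HugePages_Rsvd; [[ $var == "$get" ]]' 2>&1 | tail -n 1
```

This is why the per-key checks in the log look so noisy even though each is a single plain string comparison.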
00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.977 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 
11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:48.978 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:48.978 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 36860320 kB' 'MemUsed: 11209568 kB' 'SwapCached: 0 kB' 'Active: 5173340 kB' 'Inactive: 4207904 kB' 'Active(anon): 4703688 kB' 'Inactive(anon): 0 kB' 'Active(file): 469652 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9127212 kB' 'Mapped: 143324 kB' 'AnonPages: 257136 kB' 'Shmem: 4449656 kB' 'KernelStack: 9672 kB' 'PageTables: 5176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 147744 kB' 'Slab: 332576 kB' 'SReclaimable: 147744 kB' 'SUnreclaim: 184832 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.978 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.979 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 41074928 kB' 'MemUsed: 3148672 kB' 'SwapCached: 0 kB' 'Active: 1353952 kB' 'Inactive: 201960 kB' 'Active(anon): 1254272 kB' 'Inactive(anon): 0 kB' 'Active(file): 99680 kB' 'Inactive(file): 201960 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1310896 kB' 'Mapped: 40520 kB' 'AnonPages: 245184 kB' 'Shmem: 1009256 kB' 'KernelStack: 6312 kB' 'PageTables: 2600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 77852 kB' 'Slab: 211332 kB' 'SReclaimable: 77852 kB' 'SUnreclaim: 133480 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.979 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.979 
11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:48.980 node0=512 expecting 512 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:48.980 node1=512 expecting 512 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:48.980 00:03:48.980 real 0m3.711s 00:03:48.980 user 0m1.395s 00:03:48.980 sys 0m2.395s 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:48.980 11:40:15 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:48.980 ************************************ 00:03:48.980 END TEST even_2G_alloc 00:03:48.980 ************************************ 
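The trace above is the `get_meminfo` helper from `setup/common.sh` scanning every meminfo field until it hits the requested one (`HugePages_Surp`), falling back from `/proc/meminfo` to `/sys/devices/system/node/node<N>/meminfo` when a node is given. A condensed sketch of that lookup, not SPDK's exact implementation (the real script uses `mapfile` plus an extglob strip of the `Node N ` prefix, visible in the trace):

```shell
#!/usr/bin/env bash
# Sketch of the per-node meminfo lookup traced above. Per-node files
# prefix every line with "Node N "; strip that, then print the value
# of the requested field (0 if absent), mirroring the "echo 0" /
# "return 0" pair the xtrace shows at setup/common.sh@33.
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    sed 's/^Node [0-9]* //' "$mem_f" \
        | awk -v key="$get:" '$1 == key { print $2; found = 1; exit }
                              END      { if (!found) print 0 }'
}
```

Called as `get_meminfo HugePages_Surp 1`, it reads node 1's meminfo and prints `0` on this host, which is why the trace ends each scan with `echo 0` before the `nodes_test[node] += 0` accounting.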
00:03:48.980 11:40:16 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:48.980 11:40:16 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:48.980 11:40:16 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:48.980 11:40:16 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:49.239 ************************************ 00:03:49.239 START TEST odd_alloc 00:03:49.239 ************************************ 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1121 -- # odd_alloc 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:49.239 11:40:16 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:52.526 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:52.526 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:52.526 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:52.526 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:03:52.526 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:52.526 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:52.526 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 
00:03:52.526 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:52.526 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77954100 kB' 'MemAvailable: 82392676 kB' 'Buffers: 14216 kB' 'Cached: 10423956 kB' 'SwapCached: 0 kB' 'Active: 6533572 kB' 'Inactive: 4409864 kB' 'Active(anon): 5964240 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508980 kB' 'Mapped: 184376 kB' 'Shmem: 5458976 kB' 'KReclaimable: 225596 kB' 'Slab: 543832 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318236 kB' 'KernelStack: 16064 kB' 'PageTables: 8012 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7256756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB'
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:52.791 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
[identical setup/common.sh@31/@32 entries (IFS=': ', read -r var val _, [[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]], continue) repeat verbatim for each remaining /proc/meminfo field]
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77950584 kB' 'MemAvailable: 82389160 kB' 'Buffers: 14216 kB' 'Cached: 10423960 kB' 'SwapCached: 0 kB' 'Active: 6529304 kB' 'Inactive: 4409864 kB' 'Active(anon): 5959972 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 504248 kB' 'Mapped: 184204 kB' 'Shmem: 5458980 kB' 'KReclaimable: 225596 kB' 'Slab: 543808 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318212 kB' 'KernelStack: 16048 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7251720 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200888 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB'
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:52.793 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
[identical setup/common.sh@31/@32 compare-and-continue entries repeat for each remaining /proc/meminfo field; the trace for this section is truncated mid-loop at ShmemPmdMapped]
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.794 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.794 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:52.795 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:52.795 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77951088 kB' 'MemAvailable: 82389664 kB' 'Buffers: 14216 kB' 'Cached: 10423960 kB' 'SwapCached: 0 kB' 'Active: 6529172 kB' 'Inactive: 4409864 kB' 'Active(anon): 5959840 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 504160 kB' 'Mapped: 183856 kB' 'Shmem: 5458980 kB' 'KReclaimable: 225596 kB' 'Slab: 543808 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318212 kB' 'KernelStack: 16048 kB' 'PageTables: 7956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7251740 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB'
00:03:52.795 11:40:19 [... identical setup/common.sh@31-32 skip trace ([[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] / continue) repeated for every non-matching field from MemTotal through HugePages_Free ...]
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:03:52.797 nr_hugepages=1025
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:52.797 resv_hugepages=0
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:52.797 surplus_hugepages=0
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:52.797 anon_hugepages=0
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77951488 kB' 'MemAvailable: 82390064 kB' 'Buffers: 14216 kB' 'Cached: 10424016 kB' 'SwapCached: 0 kB' 'Active: 6528872 kB' 'Inactive: 4409864 kB' 'Active(anon): 5959540 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503764 kB' 'Mapped: 183856 kB' 'Shmem: 5459036 kB' 'KReclaimable: 225596 kB' 'Slab: 543808 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318212 kB' 'KernelStack: 16032 kB' 'PageTables: 7904 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7251760 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB'
00:03:52.797
11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:52.797 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
00:03:52.797 11:40:19 [... identical setup/common.sh@31-32 skip trace ([[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue) repeated for the non-matching fields MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal; the excerpt ends mid-scan ...]
00:03:52.798 11:40:19
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 
11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 
11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.798 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:52.799 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 
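The `get_nodes` steps traced above enumerate NUMA node directories with an extglob pattern and key a per-node hugepage array by the node index stripped from the path. A minimal standalone sketch of that technique (the temporary directory layout is fabricated for illustration; SPDK reads the real `/sys/devices/system/node`):

```shell
#!/usr/bin/env bash
# Sketch of the get_nodes enumeration seen in the trace: glob node+([0-9])
# directories and record a per-node hugepage target, indexed by node number.
shopt -s extglob nullglob

# Fabricated sysfs stand-in so the sketch is self-contained.
root=$(mktemp -d)
mkdir -p "$root/node0" "$root/node1"

declare -a nodes_sys
for node in "$root"/node+([0-9]); do
    # ${node##*node} strips everything through the last "node",
    # leaving just the numeric index (0, 1, ...).
    nodes_sys[${node##*node}]=512   # per-node page count (value illustrative)
done

no_nodes=${#nodes_sys[@]}
echo "$no_nodes"   # prints: 2
rm -rf "$root"
```

The trace assigns 512 and 513 to the two nodes the same way, then loops over `"${!nodes_test[@]}"` to adjust each count.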
setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 36874564 kB' 'MemUsed: 11195324 kB' 'SwapCached: 0 kB' 'Active: 5174276 kB' 'Inactive: 4207904 kB' 'Active(anon): 4704624 kB' 'Inactive(anon): 0 kB' 'Active(file): 469652 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9127248 kB' 'Mapped: 143336 kB' 'AnonPages: 258028 kB' 'Shmem: 4449692 kB' 'KernelStack: 9720 kB' 'PageTables: 5308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 147744 kB' 'Slab: 332428 kB' 'SReclaimable: 147744 kB' 'SUnreclaim: 184684 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.799 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.800 11:40:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.800 [... identical read/compare/continue iterations over the remaining node0 meminfo fields (KReclaimable through FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free) elided ...] 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117
-- # get_meminfo HugePages_Surp 1 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 41076168 kB' 'MemUsed: 3147432 kB' 'SwapCached: 0 kB' 'Active: 1354984 kB' 'Inactive: 201960 kB' 'Active(anon): 1255304 kB' 'Inactive(anon): 0 kB' 'Active(file): 99680 kB' 'Inactive(file): 201960 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1311008 kB' 'Mapped: 40520 kB' 'AnonPages: 246132 kB' 'Shmem: 1009368 kB' 'KernelStack: 6328 kB' 'PageTables: 2648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 77852 kB' 'Slab: 211380 kB' 'SReclaimable: 77852 kB' 'SUnreclaim: 133528 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 
'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.800 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:52.800 [... identical read/compare/continue iterations over the remaining node1 meminfo fields (MemFree through HugePages_Free) elided ...] 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:03:52.802 node0=512 expecting 513 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:52.802 node1=513 expecting 512 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:03:52.802 00:03:52.802 real 0m3.776s 00:03:52.802 user 0m1.400s 00:03:52.802 sys 0m2.461s 00:03:52.802 11:40:19
setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:52.802 11:40:19 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:52.802 ************************************ 00:03:52.802 END TEST odd_alloc 00:03:52.802 ************************************ 00:03:53.062 11:40:19 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:53.062 11:40:19 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:53.062 11:40:19 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:53.062 11:40:19 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:53.062 ************************************ 00:03:53.062 START TEST custom_alloc 00:03:53.062 ************************************ 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1121 -- # custom_alloc 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:53.062 
11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 
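The `get_test_nr_hugepages_per_node` pass traced above takes the page total computed from the size request (1,048,576 kB at the 2,048 kB default hugepage size gives 512 pages) and, since no user nodes were supplied, assigns an even share to each of the 2 NUMA nodes, filling the array from the last index down. A simplified sketch of that distribution; the helper name is hypothetical and it covers only the even-split branch, with no per-node overrides:

```shell
# Hypothetical sketch of the even-split branch seen in the trace:
# divide the total hugepage count across the detected NUMA nodes,
# assigning indices from the last node down to node 0.
split_hugepages_per_node() {
    local nr_hugepages=$1 no_nodes=$2
    local -a nodes_test=()
    local per_node=$(( nr_hugepages / no_nodes ))
    while (( no_nodes > 0 )); do
        nodes_test[no_nodes - 1]=$per_node   # fill from the last node down
        (( no_nodes-- ))
    done
    echo "${nodes_test[@]}"
}
```

For the odd_alloc run this yields 256 pages per node from a 512-page total; the custom_alloc test instead overrides the split via `nodes_hp`, which is why the trace later assigns 512 and 1024 explicitly.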
00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 
00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # 
nodes_test[_no_nodes]=1024 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:53.062 11:40:19 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:57.262 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:57.262 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:57.262 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:57.262 [... identical 'Already using the vfio-pci driver' lines for 0000:5e:00.0, 0000:00:04.0-04.6, and 0000:80:04.1-04.7 elided ...] 00:03:57.262 0000:80:04.0 (8086 2021): Already using
the vfio-pci driver 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.262 11:40:23 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 76896880 kB' 'MemAvailable: 81335456 kB' 'Buffers: 14216 kB' 'Cached: 10424104 kB' 'SwapCached: 0 kB' 'Active: 6529156 kB' 'Inactive: 4409864 kB' 'Active(anon): 5959824 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503896 kB' 'Mapped: 183868 kB' 'Shmem: 5459124 kB' 'KReclaimable: 225596 kB' 'Slab: 544076 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318480 kB' 'KernelStack: 15984 kB' 'PageTables: 7768 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7252240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200808 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.262 11:40:23 setup.sh.hugepages.custom_alloc -- 
11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:57.263 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 76896432 kB' 'MemAvailable: 81335008 kB' 'Buffers: 14216 kB' 'Cached: 10424124 kB' 'SwapCached: 0 kB' 'Active: 6529040 kB' 'Inactive: 4409864 kB' 'Active(anon): 5959708 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 
'Writeback: 0 kB' 'AnonPages: 503752 kB' 'Mapped: 183868 kB' 'Shmem: 5459144 kB' 'KReclaimable: 225596 kB' 'Slab: 544108 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318512 kB' 'KernelStack: 15968 kB' 'PageTables: 7736 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7262836 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.264 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.264 11:40:23 
00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.265 11:40:23 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.265 11:40:23 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.265 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 76896480 kB' 'MemAvailable: 81335056 kB' 'Buffers: 14216 kB' 'Cached: 10424124 kB' 'SwapCached: 0 kB' 'Active: 6529288 kB' 'Inactive: 4409864 kB' 'Active(anon): 5959956 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 
'AnonPages: 504036 kB' 'Mapped: 183868 kB' 'Shmem: 5459144 kB' 'KReclaimable: 225596 kB' 'Slab: 544108 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318512 kB' 'KernelStack: 15952 kB' 'PageTables: 7660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7251916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200792 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.266 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.266 11:40:23 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.266 11:40:23 [... identical IFS=': '/read/compare/continue trace repeated for each remaining /proc/meminfo key ...] setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.267 11:40:23
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:57.267 nr_hugepages=1536 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc 
-- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:57.267 resv_hugepages=0 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:57.267 surplus_hugepages=0 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:57.267 anon_hugepages=0 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:57.267 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 76896100 kB' 'MemAvailable: 81334676 kB' 
'Buffers: 14216 kB' 'Cached: 10424148 kB' 'SwapCached: 0 kB' 'Active: 6529152 kB' 'Inactive: 4409864 kB' 'Active(anon): 5959820 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503868 kB' 'Mapped: 183868 kB' 'Shmem: 5459168 kB' 'KReclaimable: 225596 kB' 'Slab: 544108 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318512 kB' 'KernelStack: 15952 kB' 'PageTables: 7660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7252068 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200792 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.268 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue [... repeated trace condensed: setup/common.sh@31-32 cycles through "IFS=': ' / read -r var val _ / [[ <field> == HugePages_Total ]] / continue" for every remaining meminfo field (MemFree through Unaccepted) until HugePages_Total matches ...] 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in
/sys/devices/system/node/node+([0-9]) 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.269 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 36866604 kB' 'MemUsed: 
11203284 kB' 'SwapCached: 0 kB' 'Active: 5174520 kB' 'Inactive: 4207904 kB' 'Active(anon): 4704868 kB' 'Inactive(anon): 0 kB' 'Active(file): 469652 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9127276 kB' 'Mapped: 143348 kB' 'AnonPages: 258260 kB' 'Shmem: 4449720 kB' 'KernelStack: 9656 kB' 'PageTables: 5092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 147744 kB' 'Slab: 332420 kB' 'SReclaimable: 147744 kB' 'SUnreclaim: 184676 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.270 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:57.270 11:40:23
[... identical per-field scan (IFS=': ' / read -r var val _ / [[ KEY == HugePages_Surp ]] / continue) repeated for PageTables through HugePages_Free ...]
00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:57.271 11:40:23
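The trace above is one invocation of the harness's meminfo lookup: it reads a meminfo-style file line by line with `IFS=': '` and `read -r var val _`, hits `continue` for every key that is not the requested one, then echoes the matched value and returns. A minimal standalone sketch of that pattern (the function name and sample file here are illustrative, not SPDK's actual `setup/common.sh`):

```shell
#!/usr/bin/env bash
# Sketch of the per-field scan seen in the trace: split each
# "Key: value [unit]" line on ': ' into key/value, skip (continue)
# non-matching keys, echo the value of the requested key.
get_meminfo_field() {
    local get=$1 file=$2 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # the repeated "continue" lines in the log
        echo "$val"
        return 0
    done < "$file"
    return 1    # key not present
}
```

Pointing `file` at `/proc/meminfo` (or `/sys/devices/system/node/node<N>/meminfo` for a per-node view, as the trace does for node 1) yields the same per-key lookup the log is exercising.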
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:57.271 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 40029320 kB' 'MemUsed: 4194280 kB' 'SwapCached: 0 kB' 'Active: 1354656 kB' 'Inactive: 201960 kB' 'Active(anon): 1254976 kB' 'Inactive(anon): 0 kB' 'Active(file): 99680 kB' 'Inactive(file): 201960 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1311128 kB' 'Mapped: 40520 kB' 'AnonPages: 245612 kB' 'Shmem: 1009488 kB' 'KernelStack: 6296 kB' 'PageTables: 2568 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 77852 kB' 'Slab: 211688 kB' 'SReclaimable: 77852 kB' 'SUnreclaim: 133836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:57.271 11:40:23
[... identical per-field scan (IFS=': ' / read -r var val _ / [[ KEY == HugePages_Surp ]] / continue) repeated for MemTotal through HugePages_Free ...]
00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:57.272 11:40:23
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:57.272 node0=512 expecting 512 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:57.272 node1=1024 expecting 1024 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:57.272 00:03:57.272 real 0m3.921s 00:03:57.272 user 0m1.546s 00:03:57.272 sys 0m2.466s 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:57.272 11:40:23 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:57.272 ************************************ 00:03:57.272 END TEST custom_alloc 00:03:57.272 ************************************ 00:03:57.272 11:40:23 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:57.272 11:40:23 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:57.272 11:40:23 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:57.272 11:40:23 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:57.272 ************************************ 00:03:57.272 START TEST no_shrink_alloc 00:03:57.272 ************************************ 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1121 -- # no_shrink_alloc 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:57.272 11:40:23 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:57.272 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:57.273 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:57.273 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:57.273 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:57.273 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:57.273 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 
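The `get_test_nr_hugepages 2097152 0` trace above shows how the harness turns a requested size into a hugepage count and assigns it to the requested NUMA nodes: 2097152 kB divided by the 2048 kB default hugepage size gives the `nr_hugepages=1024` seen in the log, stored per node in `nodes_test`. A hedged sketch of that arithmetic (helper and variable names mirror the trace but the function itself is illustrative, not SPDK's `setup/hugepages.sh`):

```shell
#!/usr/bin/env bash
# Sketch: convert a requested size (kB) into a count of default-size
# hugepages and assign that count to each requested NUMA node, as the
# traced get_test_nr_hugepages call does for size=2097152, node_ids=(0).
default_hugepages=2048   # kB per hugepage (2 MiB), as in the log

get_test_nr_hugepages() {
    local size=$1; shift
    local node_ids=("$@")            # remaining args are node IDs
    local nr_hugepages=$(( size / default_hugepages ))
    declare -gA nodes_test=()        # per-node page counts (global)
    local node
    for node in "${node_ids[@]}"; do
        nodes_test[$node]=$nr_hugepages
    done
    echo "$nr_hugepages"
}
```

With the log's inputs this yields 1024 pages, all placed on node 0, which matches the `nodes_test[_no_nodes]=1024` assignment in the trace.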
00:03:57.273 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:57.273 11:40:23 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:00.578 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:00.578 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:00.578 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:00.578 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:00.578 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:00.578 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:00.578 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:00.578 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:00.578 
11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:00.578 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:00.578 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:00.578 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:00.578 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77925028 kB' 'MemAvailable: 82363604 kB' 'Buffers: 14216 kB' 'Cached: 10424264 kB' 
'SwapCached: 0 kB' 'Active: 6532020 kB' 'Inactive: 4409864 kB' 'Active(anon): 5962688 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 506112 kB' 'Mapped: 183900 kB' 'Shmem: 5459284 kB' 'KReclaimable: 225596 kB' 'Slab: 544480 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318884 kB' 'KernelStack: 16048 kB' 'PageTables: 8084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7252920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.843 
11:40:27 setup.sh.hugepages.no_shrink_alloc -- [... identical per-field scan (IFS=': ' / read -r var val _ / [[ KEY == AnonHugePages ]] / continue) continues over the remaining meminfo keys ...]
00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.843 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 
11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77925112 kB' 'MemAvailable: 82363688 kB' 'Buffers: 14216 kB' 'Cached: 10424268 kB' 'SwapCached: 0 kB' 'Active: 6531748 kB' 'Inactive: 4409864 kB' 'Active(anon): 5962416 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 
kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 505884 kB' 'Mapped: 183896 kB' 'Shmem: 5459288 kB' 'KReclaimable: 225596 kB' 'Slab: 544480 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318884 kB' 'KernelStack: 16080 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7252940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.844 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.845 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:00.846 11:40:27
[... remaining /proc/meminfo fields (VmallocUsed through HugePages_Rsvd) compared against HugePages_Surp and skipped via setup/common.sh@32 continue ...]
00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.846 11:40:27
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.846 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77925360 kB' 'MemAvailable: 82363936 kB' 'Buffers: 14216 kB' 'Cached: 10424268 kB' 'SwapCached: 0 kB' 'Active: 6530952 kB' 'Inactive: 4409864 kB' 'Active(anon): 5961620 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 505544 kB' 'Mapped: 183820 kB' 'Shmem: 5459288 kB' 'KReclaimable: 225596 kB' 'Slab: 544448 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318852 kB' 'KernelStack: 16080 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7252960 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:04:00.846 11:40:27
[... all preceding /proc/meminfo fields (MemTotal through HugePages_Free) compared against HugePages_Rsvd and skipped via setup/common.sh@32 continue ...]
00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:00.848 nr_hugepages=1024 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:00.848 resv_hugepages=0 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:00.848 surplus_hugepages=0 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:00.848 anon_hugepages=0 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.848 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.849 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77925108 kB' 'MemAvailable: 82363684 kB' 'Buffers: 14216 kB' 'Cached: 10424324 kB' 'SwapCached: 0 kB' 'Active: 6530944 kB' 'Inactive: 4409864 kB' 'Active(anon): 5961612 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 505460 kB' 'Mapped: 183820 kB' 'Shmem: 5459344 kB' 'KReclaimable: 225596 kB' 'Slab: 544448 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 318852 kB' 'KernelStack: 16064 kB' 'PageTables: 8072 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7252984 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB'
00:04:00.849 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:00.849 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:04:00.849 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:00.849 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc --
setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 35797888 kB' 'MemUsed: 12272000 kB' 'SwapCached: 0 kB' 'Active: 5175624 kB' 'Inactive: 4207904 kB' 'Active(anon): 4705972 kB' 'Inactive(anon): 0 kB' 'Active(file): 469652 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9127380 kB' 'Mapped: 143300 kB' 'AnonPages: 259264 kB' 'Shmem: 4449824 kB' 'KernelStack: 9784 kB' 'PageTables: 5564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 147744 kB' 'Slab: 332508 kB' 'SReclaimable: 147744 kB' 'SUnreclaim: 184764 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': '
00:04:00.850 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:00.852 11:40:27 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:00.852 node0=1024 expecting 1024 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:00.852 11:40:27 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:05.046 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:05.046 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:05.046 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:05.046 0000:5e:00.0 (8086 0b60): Already 
using the vfio-pci driver 00:04:05.046 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:05.046 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:05.046 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:05.046 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:05.046 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:05.046 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:05.046 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:05.046 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:05.047 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:05.047 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:05.047 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:05.047 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:05.047 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:05.047 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:05.047 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:05.047 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ 
always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77903468 kB' 'MemAvailable: 82342044 kB' 'Buffers: 14216 kB' 'Cached: 10424392 kB' 'SwapCached: 0 kB' 'Active: 6529804 kB' 'Inactive: 4409864 kB' 'Active(anon): 5960472 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503956 kB' 'Mapped: 183972 kB' 'Shmem: 5459412 kB' 'KReclaimable: 225596 kB' 'Slab: 544644 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 319048 kB' 
'KernelStack: 16016 kB' 'PageTables: 7944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7254328 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.047 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 
00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77904508 kB' 'MemAvailable: 82343084 kB' 'Buffers: 14216 kB' 'Cached: 10424392 kB' 'SwapCached: 0 kB' 'Active: 6532592 kB' 'Inactive: 4409864 kB' 'Active(anon): 5963260 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 507152 kB' 'Mapped: 184388 kB' 'Shmem: 5459412 kB' 'KReclaimable: 225596 kB' 'Slab: 544600 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 319004 kB' 'KernelStack: 15984 kB' 'PageTables: 7824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 
'Committed_AS: 7256924 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200840 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB'
00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:05.048 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[identical compare/continue trace repeated for each remaining /proc/meminfo field, MemFree through HugePages_Rsvd; elided]
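The trace above is setup/common.sh walking /proc/meminfo with `IFS=': ' read -r var val _`, hitting `continue` on every field until the requested key (here HugePages_Surp) matches, then echoing its value. A minimal, self-contained sketch of that lookup pattern follows; the function name and the temp-file fixture are illustrative assumptions, not SPDK's actual code:

```shell
#!/usr/bin/env bash
# Sketch of the meminfo-scan pattern seen in the trace: split each line on
# ':' or whitespace, skip fields until the requested key matches, print its
# value. Name and interface are inferred from the log, not copied from SPDK.
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching field triggers the "continue" seen in the trace.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Exercise the sketch against a fake meminfo snippet so it is self-contained.
mem_file=$(mktemp)
printf '%s\n' 'MemTotal: 92293488 kB' 'HugePages_Surp: 0' > "$mem_file"
out=$(get_meminfo_sketch HugePages_Surp "$mem_file")
echo "$out"   # prints: 0
rm -f "$mem_file"
```

With `IFS=': '`, the colon delimits the key and the run of spaces is absorbed, so `var` gets the field name, `val` the number, and the trailing `kB` unit (when present) falls into `_`.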
00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77900476 kB' 'MemAvailable: 82339052 kB' 'Buffers: 14216 kB' 'Cached: 10424412 kB' 
'SwapCached: 0 kB' 'Active: 6529696 kB' 'Inactive: 4409864 kB' 'Active(anon): 5960364 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 504192 kB' 'Mapped: 184216 kB' 'Shmem: 5459432 kB' 'KReclaimable: 225596 kB' 'Slab: 544600 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 319004 kB' 'KernelStack: 15984 kB' 'PageTables: 7856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7253348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200840 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.050 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
[identical compare/continue trace repeated for the intervening /proc/meminfo fields, MemAvailable through SReclaimable; elided]
00:04:05.051 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.051 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.051 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.051 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:05.052 nr_hugepages=1024 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:05.052 resv_hugepages=0 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:05.052 surplus_hugepages=0 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:05.052 anon_hugepages=0 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.052 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.053 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293488 kB' 'MemFree: 77900476 kB' 'MemAvailable: 82339052 kB' 'Buffers: 14216 kB' 'Cached: 10424452 kB' 'SwapCached: 0 kB' 'Active: 6529288 kB' 'Inactive: 4409864 kB' 'Active(anon): 5959956 kB' 'Inactive(anon): 0 kB' 'Active(file): 569332 kB' 'Inactive(file): 4409864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503720 kB' 'Mapped: 183884 kB' 'Shmem: 5459472 kB' 'KReclaimable: 225596 kB' 'Slab: 544600 kB' 'SReclaimable: 225596 kB' 'SUnreclaim: 319004 kB' 'KernelStack: 15968 kB' 'PageTables: 7776 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7253368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 49920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 
0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 824744 kB' 'DirectMap2M: 15628288 kB' 'DirectMap1G: 84934656 kB' 00:04:05.053 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.053 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue [... identical skip/continue iterations for the remaining /proc/meminfo keys omitted ...] 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in 
"${!nodes_test[@]}" 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.054 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069888 kB' 'MemFree: 35784216 kB' 'MemUsed: 12285672 kB' 'SwapCached: 0 kB' 'Active: 5176172 kB' 'Inactive: 4207904 kB' 'Active(anon): 4706520 kB' 'Inactive(anon): 0 kB' 'Active(file): 469652 kB' 'Inactive(file): 4207904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9127500 kB' 'Mapped: 143364 kB' 'AnonPages: 259848 kB' 'Shmem: 4449944 kB' 'KernelStack: 9704 kB' 'PageTables: 5336 kB' 'SecPageTables: 0 kB' 
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 147744 kB' 'Slab: 332644 kB' 'SReclaimable: 147744 kB' 'SUnreclaim: 184900 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 
11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 
11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.055 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- 
# (( nodes_test[node] += 0 )) 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:05.056 node0=1024 expecting 1024 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:05.056 00:04:05.056 real 0m7.793s 00:04:05.056 user 0m2.896s 00:04:05.056 sys 0m5.083s 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:05.056 11:40:31 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:05.056 ************************************ 00:04:05.056 END TEST no_shrink_alloc 00:04:05.056 ************************************ 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in 
"/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:05.056 11:40:31 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:05.056 00:04:05.056 real 0m30.282s 00:04:05.056 user 0m10.446s 00:04:05.056 sys 0m17.963s 00:04:05.056 11:40:31 setup.sh.hugepages -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:05.056 11:40:31 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:05.056 ************************************ 00:04:05.056 END TEST hugepages 00:04:05.056 ************************************ 00:04:05.056 11:40:31 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:05.056 11:40:31 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:05.056 11:40:31 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:05.056 11:40:31 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:05.056 ************************************ 00:04:05.056 START TEST driver 00:04:05.056 ************************************ 00:04:05.056 11:40:31 setup.sh.driver -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:05.056 * Looking for test storage... 
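The hugepages records above iterate `for node in /sys/devices/system/node/node+([0-9])` and index arrays by `${node##*node}`. A sketch of just that node-discovery step, with the base path made a parameter (an assumption for testability, so the sketch does not require a live `/sys` tree; the function name is hypothetical):

```shell
# Sketch of the get_nodes pattern traced above: derive NUMA node indices
# from directory names like .../node0, .../node1 by stripping the
# leading "node" prefix and keeping only numeric suffixes.
list_node_ids() {
    local base=$1 node id ids=()
    for node in "$base"/node*; do
        id=${node##*node}
        [[ -d $node && $id =~ ^[0-9]+$ ]] || continue
        ids+=("$id")
    done
    echo "${ids[@]}"
}
```

On the traced machine this discovery yields `no_nodes=2`, matching the two `nodes_sys` assignments (`1024` and `0`) seen in the log.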
00:04:05.056 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:05.056 11:40:32 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:05.056 11:40:32 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:05.056 11:40:32 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:10.328 11:40:37 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:10.328 11:40:37 setup.sh.driver -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:10.328 11:40:37 setup.sh.driver -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:10.328 11:40:37 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:10.328 ************************************ 00:04:10.328 START TEST guess_driver 00:04:10.328 ************************************ 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- common/autotest_common.sh@1121 -- # guess_driver 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- 
# (( 216 > 0 )) 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:10.328 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:10.328 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:10.328 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:10.328 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:10.328 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:10.328 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:10.328 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:10.328 Looking for driver=vfio-pci 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output 
config 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:10.328 11:40:37 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:13.617 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:13.617 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:13.617 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.617 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:13.617 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 
00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.618 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 
00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.877 11:40:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:16.426 11:40:43 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:16.426 
11:40:43 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:16.426 11:40:43 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:16.426 11:40:43 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:16.426 11:40:43 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:16.426 11:40:43 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:16.426 11:40:43 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:21.699 00:04:21.699 real 0m11.357s 00:04:21.699 user 0m2.710s 00:04:21.699 sys 0m5.442s 00:04:21.699 11:40:48 setup.sh.driver.guess_driver -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:21.699 11:40:48 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:21.699 ************************************ 00:04:21.699 END TEST guess_driver 00:04:21.699 ************************************ 00:04:21.699 00:04:21.699 real 0m16.715s 00:04:21.699 user 0m4.279s 00:04:21.699 sys 0m8.458s 00:04:21.699 11:40:48 setup.sh.driver -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:21.699 11:40:48 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:21.699 ************************************ 00:04:21.699 END TEST driver 00:04:21.699 ************************************ 00:04:21.699 11:40:48 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:21.699 11:40:48 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:21.699 11:40:48 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:21.699 11:40:48 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:21.699 ************************************ 00:04:21.699 START TEST devices 00:04:21.699 ************************************ 
00:04:21.699 11:40:48 setup.sh.devices -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:21.958 * Looking for test storage... 00:04:21.958 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:21.958 11:40:48 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:21.958 11:40:48 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:21.958 11:40:48 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:21.958 11:40:48 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1666 -- # local nvme bdf 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@198 -- # 
min_disk_size=3221225472 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:26.150 11:40:52 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:26.150 11:40:52 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:26.150 No valid GPT data, bailing 00:04:26.150 11:40:52 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:26.150 11:40:52 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:26.150 11:40:52 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:26.150 11:40:52 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:26.150 11:40:52 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:26.150 11:40:52 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:26.150 11:40:52 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:26.150 11:40:52 setup.sh.devices -- 
setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:26.150 11:40:52 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:26.150 ************************************ 00:04:26.150 START TEST nvme_mount 00:04:26.150 ************************************ 00:04:26.150 11:40:52 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1121 -- # nvme_mount 00:04:26.150 11:40:52 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:26.150 11:40:52 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:26.151 11:40:52 
setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:26.151 11:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:27.088 Creating new GPT entries in memory. 00:04:27.088 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:27.088 other utilities. 00:04:27.088 11:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:27.088 11:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:27.088 11:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:27.088 11:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:27.088 11:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:28.056 Creating new GPT entries in memory. 00:04:28.056 The operation has completed successfully. 
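The sgdisk call traced above (`--new=1:2048:2099199`) comes out of the partition-boundary arithmetic in setup/common.sh@58-59. A minimal sketch of that arithmetic, reconstructed from the trace (not the actual SPDK script): 1 GiB partitions expressed in 512-byte sectors, with the first data sector at 2048.

```shell
#!/usr/bin/env bash
# Hedged reconstruction of the boundary math seen in the trace:
#   part_start = part_start == 0 ? 2048 : part_end + 1
#   part_end   = part_start + size - 1
size=$(( 1073741824 / 512 ))   # 1 GiB in 512-byte sectors = 2097152
part_start=0 part_end=0
for part in 1 2; do
  (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
  (( part_end = part_start + size - 1 ))
  # Emits the argument sgdisk would receive for this partition.
  echo "--new=${part}:${part_start}:${part_end}"
done
```

The first iteration reproduces the `--new=1:2048:2099199` seen in the log; the second partition (used later by the dm_mount test) starts one sector past the previous end.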
00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1607871 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:28.056 
11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:28.056 11:40:54 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.346 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:31.347 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:31.347 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:31.605 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:31.605 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:04:31.605 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:31.605 
/dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:31.605 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:31.605 11:40:58 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:31.605 11:40:58 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:31.605 11:40:58 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:31.605 11:40:58 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.864 11:40:58 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:35.151 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.151 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:35.151 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:35.151 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.151 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.151 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.152 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:35.418 11:41:02 
setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:35.418 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:35.419 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:35.419 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:35.419 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:35.419 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:35.419 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:35.419 11:41:02 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:35.419 11:41:02 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:35.419 11:41:02 
setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
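The long run of `[[ 0000:xx:yy.z == \0\0\0\0... ]]` records above is setup/devices.sh@62 filtering each PCI device reported by `setup output config` against the one allowed BDF; the backslash-escaped form in the trace is just a glob-quoted literal comparison. A hedged sketch of that filter (the function name `match_pci` is illustrative, not from SPDK):

```shell
#!/usr/bin/env bash
# Only the allowed BDF (here the NVMe under test) is inspected for
# active mounts; every other device read from the status stream is skipped.
allowed="0000:5e:00.0"

match_pci() {
  local pci=$1
  if [[ $pci == "$allowed" ]]; then
    echo "inspect"
  else
    echo "skip"
  fi
}

for pci in 0000:5e:00.0 0000:00:04.7 0000:80:04.0; do
  echo "$pci: $(match_pci "$pci")"
done
```

This is why each non-matching record in the log is immediately followed by another `read -r pci _ _ status`: the loop simply advances to the next status line.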
00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all 
/dev/nvme0n1 00:04:39.615 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:39.615 00:04:39.615 real 0m13.299s 00:04:39.615 user 0m3.904s 00:04:39.615 sys 0m7.361s 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:39.615 11:41:06 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:04:39.615 ************************************ 00:04:39.615 END TEST nvme_mount 00:04:39.616 ************************************ 00:04:39.616 11:41:06 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:39.616 11:41:06 setup.sh.devices -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:39.616 11:41:06 setup.sh.devices -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:39.616 11:41:06 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:39.616 ************************************ 00:04:39.616 START TEST dm_mount 00:04:39.616 ************************************ 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- common/autotest_common.sh@1121 -- # dm_mount 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:04:39.616 11:41:06 
setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:39.616 11:41:06 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:40.553 Creating new GPT entries in memory. 00:04:40.553 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:40.553 other utilities. 00:04:40.553 11:41:07 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:40.553 11:41:07 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:40.553 11:41:07 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:40.553 11:41:07 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:40.553 11:41:07 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:41.491 Creating new GPT entries in memory. 
00:04:41.491 The operation has completed successfully. 00:04:41.491 11:41:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:41.491 11:41:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:41.491 11:41:08 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:41.491 11:41:08 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:41.491 11:41:08 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:42.428 The operation has completed successfully. 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1612133 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 
00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:42.428 11:41:09 
setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:42.428 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:42.429 11:41:09 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 
== \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 
== \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.618 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 
== \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- 
setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:46.619 11:41:13 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.929 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- 
setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:49.930 11:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:50.217 11:41:17 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:50.217 11:41:17 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:50.217 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:50.217 11:41:17 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:50.217 11:41:17 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:50.217 00:04:50.217 real 0m10.770s 00:04:50.217 user 0m2.764s 00:04:50.217 sys 0m5.140s 00:04:50.217 11:41:17 setup.sh.devices.dm_mount -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:50.217 11:41:17 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:04:50.217 ************************************ 00:04:50.217 END TEST dm_mount 00:04:50.217 ************************************ 00:04:50.217 11:41:17 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:04:50.217 11:41:17 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:04:50.217 11:41:17 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.217 11:41:17 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:50.217 11:41:17 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:50.217 11:41:17 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:50.217 11:41:17 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:50.476 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:50.476 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:04:50.476 /dev/nvme0n1: 2 bytes were 
erased at offset 0x000001fe (PMBR): 55 aa 00:04:50.476 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:50.476 11:41:17 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:04:50.476 11:41:17 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:50.476 11:41:17 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:50.476 11:41:17 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:50.476 11:41:17 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:50.476 11:41:17 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:50.477 11:41:17 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:50.477 00:04:50.477 real 0m28.655s 00:04:50.477 user 0m8.224s 00:04:50.477 sys 0m15.433s 00:04:50.477 11:41:17 setup.sh.devices -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:50.477 11:41:17 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:50.477 ************************************ 00:04:50.477 END TEST devices 00:04:50.477 ************************************ 00:04:50.477 00:04:50.477 real 1m43.989s 00:04:50.477 user 0m31.676s 00:04:50.477 sys 0m58.565s 00:04:50.477 11:41:17 setup.sh -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:50.477 11:41:17 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:50.477 ************************************ 00:04:50.477 END TEST setup.sh 00:04:50.477 ************************************ 00:04:50.477 11:41:17 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:54.668 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:54.668 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:54.668 Hugepages 00:04:54.668 node hugesize free / total 00:04:54.668 node0 1048576kB 0 / 
0 00:04:54.668 node0 2048kB 1024 / 1024 00:04:54.668 node1 1048576kB 0 / 0 00:04:54.669 node1 2048kB 1024 / 1024 00:04:54.669 00:04:54.669 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:54.669 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:54.669 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:54.669 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:54.669 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:54.669 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:54.669 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:54.669 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:54.669 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:54.669 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:04:54.669 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:54.669 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:54.669 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:54.669 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:54.669 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:54.669 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:54.669 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:54.669 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:54.669 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:04:54.669 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:04:54.669 11:41:21 -- spdk/autotest.sh@130 -- # uname -s 00:04:54.669 11:41:21 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:04:54.669 11:41:21 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:04:54.669 11:41:21 -- common/autotest_common.sh@1527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:57.957 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:57.957 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:57.957 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:00:04.4 (8086 2021): ioatdma 
-> vfio-pci 00:04:57.957 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:57.957 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:58.216 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:58.216 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:00.751 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:00.751 11:41:27 -- common/autotest_common.sh@1528 -- # sleep 1 00:05:01.688 11:41:28 -- common/autotest_common.sh@1529 -- # bdfs=() 00:05:01.688 11:41:28 -- common/autotest_common.sh@1529 -- # local bdfs 00:05:01.688 11:41:28 -- common/autotest_common.sh@1530 -- # bdfs=($(get_nvme_bdfs)) 00:05:01.688 11:41:28 -- common/autotest_common.sh@1530 -- # get_nvme_bdfs 00:05:01.688 11:41:28 -- common/autotest_common.sh@1509 -- # bdfs=() 00:05:01.688 11:41:28 -- common/autotest_common.sh@1509 -- # local bdfs 00:05:01.688 11:41:28 -- common/autotest_common.sh@1510 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:01.688 11:41:28 -- common/autotest_common.sh@1510 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:01.688 11:41:28 -- common/autotest_common.sh@1510 -- # jq -r '.config[].params.traddr' 00:05:01.948 11:41:28 -- common/autotest_common.sh@1511 -- # (( 1 == 0 )) 00:05:01.948 11:41:28 -- common/autotest_common.sh@1515 -- # printf '%s\n' 0000:5e:00.0 00:05:01.948 11:41:28 -- common/autotest_common.sh@1532 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:05.237 0000:d7:05.5 (8086 201d): Skipping not allowed 
VMD controller at 0000:d7:05.5 00:05:05.237 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:05.237 Waiting for block devices as requested 00:05:05.496 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:05:05.496 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:05.754 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:05.754 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:05.754 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:06.014 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:06.014 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:06.014 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:06.272 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:06.272 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:06.272 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:06.531 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:06.531 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:06.531 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:06.791 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:06.791 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:06.791 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:06.791 11:41:33 -- common/autotest_common.sh@1534 -- # for bdf in "${bdfs[@]}" 00:05:06.791 11:41:33 -- common/autotest_common.sh@1535 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:06.791 11:41:33 -- common/autotest_common.sh@1498 -- # readlink -f /sys/class/nvme/nvme0 00:05:06.791 11:41:33 -- common/autotest_common.sh@1498 -- # grep 0000:5e:00.0/nvme/nvme 00:05:06.791 11:41:33 -- common/autotest_common.sh@1498 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:06.791 11:41:33 -- common/autotest_common.sh@1499 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:06.791 11:41:33 -- common/autotest_common.sh@1503 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:06.791 11:41:33 -- common/autotest_common.sh@1503 -- 
# printf '%s\n' nvme0 00:05:06.791 11:41:33 -- common/autotest_common.sh@1535 -- # nvme_ctrlr=/dev/nvme0 00:05:06.791 11:41:33 -- common/autotest_common.sh@1536 -- # [[ -z /dev/nvme0 ]] 00:05:06.791 11:41:33 -- common/autotest_common.sh@1541 -- # nvme id-ctrl /dev/nvme0 00:05:06.791 11:41:33 -- common/autotest_common.sh@1541 -- # grep oacs 00:05:06.791 11:41:33 -- common/autotest_common.sh@1541 -- # cut -d: -f2 00:05:07.049 11:41:33 -- common/autotest_common.sh@1541 -- # oacs=' 0x3f' 00:05:07.049 11:41:33 -- common/autotest_common.sh@1542 -- # oacs_ns_manage=8 00:05:07.049 11:41:33 -- common/autotest_common.sh@1544 -- # [[ 8 -ne 0 ]] 00:05:07.049 11:41:33 -- common/autotest_common.sh@1550 -- # nvme id-ctrl /dev/nvme0 00:05:07.049 11:41:33 -- common/autotest_common.sh@1550 -- # grep unvmcap 00:05:07.049 11:41:33 -- common/autotest_common.sh@1550 -- # cut -d: -f2 00:05:07.049 11:41:33 -- common/autotest_common.sh@1550 -- # unvmcap=' 0' 00:05:07.049 11:41:33 -- common/autotest_common.sh@1551 -- # [[ 0 -eq 0 ]] 00:05:07.049 11:41:33 -- common/autotest_common.sh@1553 -- # continue 00:05:07.049 11:41:33 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:07.049 11:41:33 -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:07.049 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:07.049 11:41:33 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:07.049 11:41:33 -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:07.049 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:07.049 11:41:33 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:11.239 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:11.240 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:11.240 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 
00:05:11.240 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:11.240 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:13.175 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:13.175 11:41:40 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:13.175 11:41:40 -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:13.175 11:41:40 -- common/autotest_common.sh@10 -- # set +x 00:05:13.175 11:41:40 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:13.175 11:41:40 -- common/autotest_common.sh@1587 -- # mapfile -t bdfs 00:05:13.175 11:41:40 -- common/autotest_common.sh@1587 -- # get_nvme_bdfs_by_id 0x0a54 00:05:13.175 11:41:40 -- common/autotest_common.sh@1573 -- # bdfs=() 00:05:13.175 11:41:40 -- common/autotest_common.sh@1573 -- # local bdfs 00:05:13.175 11:41:40 -- common/autotest_common.sh@1575 -- # get_nvme_bdfs 00:05:13.175 11:41:40 -- common/autotest_common.sh@1509 -- # bdfs=() 00:05:13.175 11:41:40 -- common/autotest_common.sh@1509 -- # local bdfs 00:05:13.175 11:41:40 -- common/autotest_common.sh@1510 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:13.175 11:41:40 -- common/autotest_common.sh@1510 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:13.175 11:41:40 -- common/autotest_common.sh@1510 -- # jq -r '.config[].params.traddr' 
00:05:13.434 11:41:40 -- common/autotest_common.sh@1511 -- # (( 1 == 0 )) 00:05:13.434 11:41:40 -- common/autotest_common.sh@1515 -- # printf '%s\n' 0000:5e:00.0 00:05:13.434 11:41:40 -- common/autotest_common.sh@1575 -- # for bdf in $(get_nvme_bdfs) 00:05:13.434 11:41:40 -- common/autotest_common.sh@1576 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:05:13.434 11:41:40 -- common/autotest_common.sh@1576 -- # device=0x0b60 00:05:13.434 11:41:40 -- common/autotest_common.sh@1577 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:05:13.434 11:41:40 -- common/autotest_common.sh@1582 -- # printf '%s\n' 00:05:13.434 11:41:40 -- common/autotest_common.sh@1588 -- # [[ -z '' ]] 00:05:13.434 11:41:40 -- common/autotest_common.sh@1589 -- # return 0 00:05:13.434 11:41:40 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:13.434 11:41:40 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:13.434 11:41:40 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:05:13.434 11:41:40 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:05:13.434 11:41:40 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:05:14.002 Restarting all devices. 00:05:18.191 lstat() error: No such file or directory 00:05:18.191 QAT Error: No GENERAL section found 00:05:18.191 Failed to configure qat_dev0 00:05:18.191 lstat() error: No such file or directory 00:05:18.191 QAT Error: No GENERAL section found 00:05:18.191 Failed to configure qat_dev1 00:05:18.191 lstat() error: No such file or directory 00:05:18.191 QAT Error: No GENERAL section found 00:05:18.191 Failed to configure qat_dev2 00:05:18.191 enable sriov 00:05:18.191 Checking status of all devices. 
00:05:18.191 There is 3 QAT acceleration device(s) in the system: 00:05:18.191 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:05:18.191 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:05:18.191 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:05:18.191 0000:3d:00.0 set to 16 VFs 00:05:19.128 0000:3f:00.0 set to 16 VFs 00:05:19.695 0000:da:00.0 set to 16 VFs 00:05:21.070 Properly configured the qat device with driver uio_pci_generic. 00:05:21.070 11:41:48 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:21.070 11:41:48 -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:21.070 11:41:48 -- common/autotest_common.sh@10 -- # set +x 00:05:21.070 11:41:48 -- spdk/autotest.sh@164 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:21.070 11:41:48 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:21.070 11:41:48 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:21.070 11:41:48 -- common/autotest_common.sh@10 -- # set +x 00:05:21.328 ************************************ 00:05:21.328 START TEST env 00:05:21.328 ************************************ 00:05:21.328 11:41:48 env -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:21.328 * Looking for test storage... 
00:05:21.328 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:21.328 11:41:48 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:21.328 11:41:48 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:21.328 11:41:48 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:21.328 11:41:48 env -- common/autotest_common.sh@10 -- # set +x 00:05:21.328 ************************************ 00:05:21.328 START TEST env_memory 00:05:21.328 ************************************ 00:05:21.328 11:41:48 env.env_memory -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:21.328 00:05:21.328 00:05:21.328 CUnit - A unit testing framework for C - Version 2.1-3 00:05:21.328 http://cunit.sourceforge.net/ 00:05:21.328 00:05:21.328 00:05:21.328 Suite: memory 00:05:21.328 Test: alloc and free memory map ...[2024-05-14 11:41:48.410190] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:21.587 passed 00:05:21.587 Test: mem map translation ...[2024-05-14 11:41:48.439485] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:21.587 [2024-05-14 11:41:48.439508] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:21.587 [2024-05-14 11:41:48.439563] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:21.587 [2024-05-14 11:41:48.439577] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 
600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:21.587 passed 00:05:21.587 Test: mem map registration ...[2024-05-14 11:41:48.497472] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:21.587 [2024-05-14 11:41:48.497494] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:21.587 passed 00:05:21.587 Test: mem map adjacent registrations ...passed 00:05:21.587 00:05:21.587 Run Summary: Type Total Ran Passed Failed Inactive 00:05:21.587 suites 1 1 n/a 0 0 00:05:21.587 tests 4 4 4 0 0 00:05:21.587 asserts 152 152 152 0 n/a 00:05:21.587 00:05:21.587 Elapsed time = 0.201 seconds 00:05:21.587 00:05:21.587 real 0m0.216s 00:05:21.587 user 0m0.200s 00:05:21.587 sys 0m0.015s 00:05:21.587 11:41:48 env.env_memory -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:21.587 11:41:48 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:21.587 ************************************ 00:05:21.587 END TEST env_memory 00:05:21.587 ************************************ 00:05:21.587 11:41:48 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:21.587 11:41:48 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:21.587 11:41:48 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:21.587 11:41:48 env -- common/autotest_common.sh@10 -- # set +x 00:05:21.847 ************************************ 00:05:21.847 START TEST env_vtophys 00:05:21.847 ************************************ 00:05:21.847 11:41:48 env.env_vtophys -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:21.847 EAL: lib.eal log level changed from notice to debug 00:05:21.847 EAL: 
Detected lcore 0 as core 0 on socket 0 00:05:21.847 EAL: Detected lcore 1 as core 1 on socket 0 00:05:21.847 EAL: Detected lcore 2 as core 2 on socket 0 00:05:21.847 EAL: Detected lcore 3 as core 3 on socket 0 00:05:21.847 EAL: Detected lcore 4 as core 4 on socket 0 00:05:21.847 EAL: Detected lcore 5 as core 8 on socket 0 00:05:21.847 EAL: Detected lcore 6 as core 9 on socket 0 00:05:21.847 EAL: Detected lcore 7 as core 10 on socket 0 00:05:21.847 EAL: Detected lcore 8 as core 11 on socket 0 00:05:21.847 EAL: Detected lcore 9 as core 16 on socket 0 00:05:21.847 EAL: Detected lcore 10 as core 17 on socket 0 00:05:21.847 EAL: Detected lcore 11 as core 18 on socket 0 00:05:21.847 EAL: Detected lcore 12 as core 19 on socket 0 00:05:21.847 EAL: Detected lcore 13 as core 20 on socket 0 00:05:21.847 EAL: Detected lcore 14 as core 24 on socket 0 00:05:21.847 EAL: Detected lcore 15 as core 25 on socket 0 00:05:21.847 EAL: Detected lcore 16 as core 26 on socket 0 00:05:21.847 EAL: Detected lcore 17 as core 27 on socket 0 00:05:21.847 EAL: Detected lcore 18 as core 0 on socket 1 00:05:21.847 EAL: Detected lcore 19 as core 1 on socket 1 00:05:21.847 EAL: Detected lcore 20 as core 2 on socket 1 00:05:21.847 EAL: Detected lcore 21 as core 3 on socket 1 00:05:21.847 EAL: Detected lcore 22 as core 4 on socket 1 00:05:21.847 EAL: Detected lcore 23 as core 8 on socket 1 00:05:21.847 EAL: Detected lcore 24 as core 9 on socket 1 00:05:21.847 EAL: Detected lcore 25 as core 10 on socket 1 00:05:21.847 EAL: Detected lcore 26 as core 11 on socket 1 00:05:21.847 EAL: Detected lcore 27 as core 16 on socket 1 00:05:21.847 EAL: Detected lcore 28 as core 17 on socket 1 00:05:21.847 EAL: Detected lcore 29 as core 18 on socket 1 00:05:21.847 EAL: Detected lcore 30 as core 19 on socket 1 00:05:21.847 EAL: Detected lcore 31 as core 20 on socket 1 00:05:21.847 EAL: Detected lcore 32 as core 24 on socket 1 00:05:21.847 EAL: Detected lcore 33 as core 25 on socket 1 00:05:21.847 EAL: Detected lcore 34 
as core 26 on socket 1 00:05:21.847 EAL: Detected lcore 35 as core 27 on socket 1 00:05:21.847 EAL: Detected lcore 36 as core 0 on socket 0 00:05:21.847 EAL: Detected lcore 37 as core 1 on socket 0 00:05:21.847 EAL: Detected lcore 38 as core 2 on socket 0 00:05:21.847 EAL: Detected lcore 39 as core 3 on socket 0 00:05:21.847 EAL: Detected lcore 40 as core 4 on socket 0 00:05:21.847 EAL: Detected lcore 41 as core 8 on socket 0 00:05:21.847 EAL: Detected lcore 42 as core 9 on socket 0 00:05:21.847 EAL: Detected lcore 43 as core 10 on socket 0 00:05:21.847 EAL: Detected lcore 44 as core 11 on socket 0 00:05:21.847 EAL: Detected lcore 45 as core 16 on socket 0 00:05:21.847 EAL: Detected lcore 46 as core 17 on socket 0 00:05:21.847 EAL: Detected lcore 47 as core 18 on socket 0 00:05:21.847 EAL: Detected lcore 48 as core 19 on socket 0 00:05:21.847 EAL: Detected lcore 49 as core 20 on socket 0 00:05:21.848 EAL: Detected lcore 50 as core 24 on socket 0 00:05:21.848 EAL: Detected lcore 51 as core 25 on socket 0 00:05:21.848 EAL: Detected lcore 52 as core 26 on socket 0 00:05:21.848 EAL: Detected lcore 53 as core 27 on socket 0 00:05:21.848 EAL: Detected lcore 54 as core 0 on socket 1 00:05:21.848 EAL: Detected lcore 55 as core 1 on socket 1 00:05:21.848 EAL: Detected lcore 56 as core 2 on socket 1 00:05:21.848 EAL: Detected lcore 57 as core 3 on socket 1 00:05:21.848 EAL: Detected lcore 58 as core 4 on socket 1 00:05:21.848 EAL: Detected lcore 59 as core 8 on socket 1 00:05:21.848 EAL: Detected lcore 60 as core 9 on socket 1 00:05:21.848 EAL: Detected lcore 61 as core 10 on socket 1 00:05:21.848 EAL: Detected lcore 62 as core 11 on socket 1 00:05:21.848 EAL: Detected lcore 63 as core 16 on socket 1 00:05:21.848 EAL: Detected lcore 64 as core 17 on socket 1 00:05:21.848 EAL: Detected lcore 65 as core 18 on socket 1 00:05:21.848 EAL: Detected lcore 66 as core 19 on socket 1 00:05:21.848 EAL: Detected lcore 67 as core 20 on socket 1 00:05:21.848 EAL: Detected lcore 68 as core 
24 on socket 1 00:05:21.848 EAL: Detected lcore 69 as core 25 on socket 1 00:05:21.848 EAL: Detected lcore 70 as core 26 on socket 1 00:05:21.848 EAL: Detected lcore 71 as core 27 on socket 1 00:05:21.848 EAL: Maximum logical cores by configuration: 128 00:05:21.848 EAL: Detected CPU lcores: 72 00:05:21.848 EAL: Detected NUMA nodes: 2 00:05:21.848 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:21.848 EAL: Detected shared linkage of DPDK 00:05:21.848 EAL: No shared files mode enabled, IPC will be disabled 00:05:21.848 EAL: No shared files mode enabled, IPC is disabled 00:05:21.848 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 
0000:3f:01.2 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA 
as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:05:21.848 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:05:21.848 EAL: Bus pci wants IOVA as 'PA' 00:05:21.848 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:21.848 EAL: Bus vdev wants IOVA as 'DC' 00:05:21.848 EAL: Selected IOVA mode 'PA' 00:05:21.848 EAL: Probing VFIO support... 00:05:21.848 EAL: IOMMU type 1 (Type 1) is supported 00:05:21.848 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:21.848 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:21.848 EAL: VFIO support initialized 00:05:21.848 EAL: Ask a virtual area of 0x2e000 bytes 00:05:21.848 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:21.848 EAL: Setting up physically contiguous memory... 00:05:21.848 EAL: Setting maximum number of open files to 524288 00:05:21.848 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:21.848 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:21.848 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:21.848 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.848 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:21.848 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:21.848 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.848 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:21.848 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:21.848 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.848 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:21.848 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:21.848 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.848 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:21.848 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:21.848 EAL: 
Ask a virtual area of 0x61000 bytes 00:05:21.848 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:21.848 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:21.848 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.848 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:21.848 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:21.848 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.848 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:21.848 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:21.848 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.848 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:21.848 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:21.848 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:21.848 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.848 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:21.848 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:21.848 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.848 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:21.848 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:21.848 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.848 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:21.848 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:21.848 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.848 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:21.848 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:21.848 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.848 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:21.848 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:21.848 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.848 EAL: Virtual area 
found at 0x201800e00000 (size = 0x400000000) 00:05:21.848 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:21.848 EAL: Ask a virtual area of 0x61000 bytes 00:05:21.848 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:21.848 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:21.848 EAL: Ask a virtual area of 0x400000000 bytes 00:05:21.848 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:21.848 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:21.848 EAL: Hugepages will be freed exactly as allocated. 00:05:21.848 EAL: No shared files mode enabled, IPC is disabled 00:05:21.848 EAL: No shared files mode enabled, IPC is disabled 00:05:21.848 EAL: TSC frequency is ~2300000 KHz 00:05:21.848 EAL: Main lcore 0 is ready (tid=7f167c469b00;cpuset=[0]) 00:05:21.848 EAL: Trying to obtain current memory policy. 00:05:21.848 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.848 EAL: Restoring previous memory policy: 0 00:05:21.848 EAL: request: mp_malloc_sync 00:05:21.848 EAL: No shared files mode enabled, IPC is disabled 00:05:21.848 EAL: Heap on socket 0 was expanded by 2MB 00:05:21.848 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001000000 00:05:21.848 EAL: PCI memory mapped at 0x202001001000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001002000 00:05:21.848 EAL: PCI memory mapped at 0x202001003000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001004000 00:05:21.848 EAL: PCI memory mapped at 0x202001005000 00:05:21.848 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001006000 00:05:21.848 EAL: PCI memory mapped at 0x202001007000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001008000 00:05:21.848 EAL: PCI memory mapped at 0x202001009000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x20200100a000 00:05:21.848 EAL: PCI memory mapped at 0x20200100b000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x20200100c000 00:05:21.848 EAL: PCI memory mapped at 0x20200100d000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x20200100e000 00:05:21.848 EAL: PCI memory mapped at 0x20200100f000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001010000 00:05:21.848 EAL: PCI memory mapped at 0x202001011000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 
0x202001012000 00:05:21.848 EAL: PCI memory mapped at 0x202001013000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001014000 00:05:21.848 EAL: PCI memory mapped at 0x202001015000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001016000 00:05:21.848 EAL: PCI memory mapped at 0x202001017000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001018000 00:05:21.848 EAL: PCI memory mapped at 0x202001019000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x20200101a000 00:05:21.848 EAL: PCI memory mapped at 0x20200101b000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x20200101c000 00:05:21.848 EAL: PCI memory mapped at 0x20200101d000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:21.848 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x20200101e000 00:05:21.848 EAL: PCI memory mapped at 0x20200101f000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:21.848 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 
00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001020000 00:05:21.848 EAL: PCI memory mapped at 0x202001021000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:21.848 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001022000 00:05:21.848 EAL: PCI memory mapped at 0x202001023000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:21.848 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001024000 00:05:21.848 EAL: PCI memory mapped at 0x202001025000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:21.848 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001026000 00:05:21.848 EAL: PCI memory mapped at 0x202001027000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:21.848 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x202001028000 00:05:21.848 EAL: PCI memory mapped at 0x202001029000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:21.848 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:05:21.848 EAL: probe driver: 8086:37c9 qat 00:05:21.848 EAL: PCI memory mapped at 0x20200102a000 00:05:21.848 EAL: PCI memory mapped at 0x20200102b000 00:05:21.848 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:21.848 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200102c000 00:05:21.849 EAL: PCI memory mapped at 0x20200102d000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:3f:01.6 (socket 0) 00:05:21.849 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200102e000 00:05:21.849 EAL: PCI memory mapped at 0x20200102f000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:21.849 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001030000 00:05:21.849 EAL: PCI memory mapped at 0x202001031000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:21.849 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001032000 00:05:21.849 EAL: PCI memory mapped at 0x202001033000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:21.849 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001034000 00:05:21.849 EAL: PCI memory mapped at 0x202001035000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:21.849 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001036000 00:05:21.849 EAL: PCI memory mapped at 0x202001037000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:21.849 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001038000 00:05:21.849 EAL: PCI memory mapped at 0x202001039000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:21.849 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200103a000 00:05:21.849 EAL: PCI memory 
mapped at 0x20200103b000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:21.849 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200103c000 00:05:21.849 EAL: PCI memory mapped at 0x20200103d000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:21.849 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200103e000 00:05:21.849 EAL: PCI memory mapped at 0x20200103f000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:21.849 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001040000 00:05:21.849 EAL: PCI memory mapped at 0x202001041000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:21.849 EAL: Trying to obtain current memory policy. 
00:05:21.849 EAL: Setting policy MPOL_PREFERRED for socket 1 00:05:21.849 EAL: Restoring previous memory policy: 4 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 1 was expanded by 2MB 00:05:21.849 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001042000 00:05:21.849 EAL: PCI memory mapped at 0x202001043000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001044000 00:05:21.849 EAL: PCI memory mapped at 0x202001045000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001046000 00:05:21.849 EAL: PCI memory mapped at 0x202001047000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001048000 00:05:21.849 EAL: PCI memory mapped at 0x202001049000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200104a000 00:05:21.849 EAL: PCI memory mapped at 0x20200104b000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200104c000 00:05:21.849 EAL: PCI memory mapped at 0x20200104d000 00:05:21.849 EAL: Probe 
PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200104e000 00:05:21.849 EAL: PCI memory mapped at 0x20200104f000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001050000 00:05:21.849 EAL: PCI memory mapped at 0x202001051000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001052000 00:05:21.849 EAL: PCI memory mapped at 0x202001053000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001054000 00:05:21.849 EAL: PCI memory mapped at 0x202001055000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001056000 00:05:21.849 EAL: PCI memory mapped at 0x202001057000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x202001058000 00:05:21.849 EAL: PCI memory mapped at 0x202001059000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 
0x20200105a000 00:05:21.849 EAL: PCI memory mapped at 0x20200105b000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200105c000 00:05:21.849 EAL: PCI memory mapped at 0x20200105d000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:21.849 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:05:21.849 EAL: probe driver: 8086:37c9 qat 00:05:21.849 EAL: PCI memory mapped at 0x20200105e000 00:05:21.849 EAL: PCI memory mapped at 0x20200105f000 00:05:21.849 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: No PCI address specified using 'addr=' in: bus=pci 00:05:21.849 EAL: Mem event callback 'spdk:(nil)' registered 00:05:21.849 00:05:21.849 00:05:21.849 CUnit - A unit testing framework for C - Version 2.1-3 00:05:21.849 http://cunit.sourceforge.net/ 00:05:21.849 00:05:21.849 00:05:21.849 Suite: components_suite 00:05:21.849 Test: vtophys_malloc_test ...passed 00:05:21.849 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:21.849 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.849 EAL: Restoring previous memory policy: 4 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was expanded by 4MB 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was shrunk by 4MB 00:05:21.849 EAL: Trying to obtain current memory policy. 
00:05:21.849 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.849 EAL: Restoring previous memory policy: 4 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was expanded by 6MB 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was shrunk by 6MB 00:05:21.849 EAL: Trying to obtain current memory policy. 00:05:21.849 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.849 EAL: Restoring previous memory policy: 4 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was expanded by 10MB 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was shrunk by 10MB 00:05:21.849 EAL: Trying to obtain current memory policy. 00:05:21.849 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.849 EAL: Restoring previous memory policy: 4 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was expanded by 18MB 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was shrunk by 18MB 00:05:21.849 EAL: Trying to obtain current memory policy. 
00:05:21.849 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.849 EAL: Restoring previous memory policy: 4 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was expanded by 34MB 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was shrunk by 34MB 00:05:21.849 EAL: Trying to obtain current memory policy. 00:05:21.849 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.849 EAL: Restoring previous memory policy: 4 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was expanded by 66MB 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was shrunk by 66MB 00:05:21.849 EAL: Trying to obtain current memory policy. 00:05:21.849 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:21.849 EAL: Restoring previous memory policy: 4 00:05:21.849 EAL: Calling mem event callback 'spdk:(nil)' 00:05:21.849 EAL: request: mp_malloc_sync 00:05:21.849 EAL: No shared files mode enabled, IPC is disabled 00:05:21.849 EAL: Heap on socket 0 was expanded by 130MB 00:05:22.108 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.108 EAL: request: mp_malloc_sync 00:05:22.108 EAL: No shared files mode enabled, IPC is disabled 00:05:22.108 EAL: Heap on socket 0 was shrunk by 130MB 00:05:22.108 EAL: Trying to obtain current memory policy. 
00:05:22.108 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:22.108 EAL: Restoring previous memory policy: 4 00:05:22.108 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.108 EAL: request: mp_malloc_sync 00:05:22.108 EAL: No shared files mode enabled, IPC is disabled 00:05:22.108 EAL: Heap on socket 0 was expanded by 258MB 00:05:22.108 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.108 EAL: request: mp_malloc_sync 00:05:22.108 EAL: No shared files mode enabled, IPC is disabled 00:05:22.108 EAL: Heap on socket 0 was shrunk by 258MB 00:05:22.108 EAL: Trying to obtain current memory policy. 00:05:22.108 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:22.368 EAL: Restoring previous memory policy: 4 00:05:22.368 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.368 EAL: request: mp_malloc_sync 00:05:22.368 EAL: No shared files mode enabled, IPC is disabled 00:05:22.368 EAL: Heap on socket 0 was expanded by 514MB 00:05:22.368 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.368 EAL: request: mp_malloc_sync 00:05:22.368 EAL: No shared files mode enabled, IPC is disabled 00:05:22.368 EAL: Heap on socket 0 was shrunk by 514MB 00:05:22.368 EAL: Trying to obtain current memory policy. 
00:05:22.368 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:22.626 EAL: Restoring previous memory policy: 4 00:05:22.626 EAL: Calling mem event callback 'spdk:(nil)' 00:05:22.626 EAL: request: mp_malloc_sync 00:05:22.626 EAL: No shared files mode enabled, IPC is disabled 00:05:22.626 EAL: Heap on socket 0 was expanded by 1026MB 00:05:22.885 EAL: Calling mem event callback 'spdk:(nil)' 00:05:23.145 EAL: request: mp_malloc_sync 00:05:23.145 EAL: No shared files mode enabled, IPC is disabled 00:05:23.145 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:23.145 passed 00:05:23.145 00:05:23.145 Run Summary: Type Total Ran Passed Failed Inactive 00:05:23.145 suites 1 1 n/a 0 0 00:05:23.145 tests 2 2 2 0 0 00:05:23.145 asserts 6079 6079 6079 0 n/a 00:05:23.145 00:05:23.145 Elapsed time = 1.239 seconds 00:05:23.145 EAL: No shared files mode enabled, IPC is disabled 00:05:23.145 EAL: No shared files mode enabled, IPC is disabled 00:05:23.145 EAL: No shared files mode enabled, IPC is disabled 00:05:23.145 00:05:23.145 real 0m1.433s 00:05:23.145 user 0m0.798s 00:05:23.145 sys 0m0.600s 00:05:23.145 11:41:50 env.env_vtophys -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:23.145 11:41:50 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:05:23.145 ************************************ 00:05:23.145 END TEST env_vtophys 00:05:23.145 ************************************ 00:05:23.145 11:41:50 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:23.145 11:41:50 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:23.145 11:41:50 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:23.145 11:41:50 env -- common/autotest_common.sh@10 -- # set +x 00:05:23.145 ************************************ 00:05:23.145 START TEST env_pci 00:05:23.145 ************************************ 00:05:23.145 11:41:50 env.env_pci -- common/autotest_common.sh@1121 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:23.145 00:05:23.145 00:05:23.145 CUnit - A unit testing framework for C - Version 2.1-3 00:05:23.145 http://cunit.sourceforge.net/ 00:05:23.145 00:05:23.145 00:05:23.145 Suite: pci 00:05:23.145 Test: pci_hook ...[2024-05-14 11:41:50.224451] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1622917 has claimed it 00:05:23.405 EAL: Cannot find device (10000:00:01.0) 00:05:23.405 EAL: Failed to attach device on primary process 00:05:23.405 passed 00:05:23.405 00:05:23.405 Run Summary: Type Total Ran Passed Failed Inactive 00:05:23.405 suites 1 1 n/a 0 0 00:05:23.405 tests 1 1 1 0 0 00:05:23.405 asserts 25 25 25 0 n/a 00:05:23.405 00:05:23.405 Elapsed time = 0.043 seconds 00:05:23.405 00:05:23.405 real 0m0.071s 00:05:23.405 user 0m0.020s 00:05:23.405 sys 0m0.051s 00:05:23.405 11:41:50 env.env_pci -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:23.405 11:41:50 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:05:23.405 ************************************ 00:05:23.405 END TEST env_pci 00:05:23.405 ************************************ 00:05:23.405 11:41:50 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:23.405 11:41:50 env -- env/env.sh@15 -- # uname 00:05:23.405 11:41:50 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:23.405 11:41:50 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:23.405 11:41:50 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:23.405 11:41:50 env -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:05:23.405 11:41:50 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:23.405 11:41:50 env -- common/autotest_common.sh@10 -- # set +x 
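The `vtophys_spdk_malloc_test` output above shows the heap being expanded and shrunk in a fixed progression: 4MB, 6MB, 10MB, 18MB, 34MB, 66MB, 130MB, 258MB, 514MB, 1026MB. Each size is 2 MB more than a power of two, i.e. `2**k + 2` MB for k = 1..10. The sketch below is a reconstruction of that sequence from the log alone, not code taken from the SPDK test source:

```python
# Heap growth sizes observed in the vtophys_spdk_malloc_test EAL messages
# ("Heap on socket 0 was expanded by NMB"). The pattern matches 2**k + 2 MB;
# this is an observation from the log, not the test's actual allocation logic.
sizes_mb = [2**k + 2 for k in range(1, 11)]
print(sizes_mb)  # [4, 6, 10, 18, 34, 66, 130, 258, 514, 1026]
```

The +2MB offset is consistent with each CUnit iteration allocating slightly past a power-of-two boundary so that the mem-event callback (`spdk:(nil)` in the log) fires for an extra 2MB hugepage on every expand/shrink cycle.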
00:05:23.405 ************************************ 00:05:23.405 START TEST env_dpdk_post_init 00:05:23.405 ************************************ 00:05:23.405 11:41:50 env.env_dpdk_post_init -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:23.405 EAL: Detected CPU lcores: 72 00:05:23.405 EAL: Detected NUMA nodes: 2 00:05:23.405 EAL: Detected shared linkage of DPDK 00:05:23.405 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:23.405 EAL: Selected IOVA mode 'PA' 00:05:23.405 EAL: VFIO support initialized 00:05:23.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.405 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:05:23.405 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:23.405 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: 
Creating cryptodev 0000:3d:01.7_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 
00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 
00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 
0000:3f:01.5_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:05:23.406 CRYPTODEV: Initialisation 
parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:05:23.406 CRYPTODEV: 
Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, 
max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:05:23.406 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.406 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 
0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:23.407 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:23.407 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:23.407 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:23.666 EAL: Using IOMMU type 1 (Type 1) 00:05:23.666 EAL: Ignore mapping IO port bar(1) 00:05:23.666 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 
00:05:23.666 EAL: Ignore mapping IO port bar(1) 00:05:23.666 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:05:23.666 EAL: Ignore mapping IO port bar(1) 00:05:23.666 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:05:23.666 EAL: Ignore mapping IO port bar(1) 00:05:23.666 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:05:23.666 EAL: Ignore mapping IO port bar(1) 00:05:23.666 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:05:23.666 EAL: Ignore mapping IO port bar(1) 00:05:23.666 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:05:23.666 EAL: Ignore mapping IO port bar(1) 00:05:23.667 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:05:23.667 EAL: Ignore mapping IO port bar(1) 00:05:23.667 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:05:23.926 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:05:23.926 EAL: Ignore mapping IO port bar(1) 00:05:23.926 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:05:23.926 EAL: Ignore mapping IO port bar(1) 00:05:23.926 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:05:23.926 EAL: Ignore mapping IO port bar(1) 00:05:23.926 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:05:23.926 EAL: Ignore mapping IO port bar(1) 00:05:23.926 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:05:23.926 EAL: Ignore mapping IO port bar(1) 00:05:23.926 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:05:23.926 EAL: Ignore mapping IO port bar(1) 00:05:23.926 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:05:23.926 EAL: Ignore mapping IO port bar(1) 00:05:23.926 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 
0000:80:04.6 (socket 1) 00:05:23.926 EAL: Ignore mapping IO port bar(1) 00:05:23.926 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:05:23.926 EAL: Ignore mapping IO port bar(1) 00:05:23.926 EAL: Ignore mapping IO port bar(5) 00:05:23.926 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:05:24.185 EAL: Ignore mapping IO port bar(1) 00:05:24.185 EAL: Ignore mapping IO port bar(5) 00:05:24.185 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:05:26.719 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:05:26.719 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:05:26.719 Starting DPDK initialization... 00:05:26.719 Starting SPDK post initialization... 00:05:26.719 SPDK NVMe probe 00:05:26.719 Attaching to 0000:5e:00.0 00:05:26.719 Attached to 0000:5e:00.0 00:05:26.719 Cleaning up... 00:05:26.719 00:05:26.719 real 0m3.290s 00:05:26.719 user 0m2.198s 00:05:26.719 sys 0m0.650s 00:05:26.719 11:41:53 env.env_dpdk_post_init -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:26.719 11:41:53 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:26.719 ************************************ 00:05:26.719 END TEST env_dpdk_post_init 00:05:26.719 ************************************ 00:05:26.719 11:41:53 env -- env/env.sh@26 -- # uname 00:05:26.719 11:41:53 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:26.719 11:41:53 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:26.719 11:41:53 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:26.719 11:41:53 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:26.719 11:41:53 env -- common/autotest_common.sh@10 -- # set +x 00:05:26.719 ************************************ 00:05:26.719 START TEST env_mem_callbacks 00:05:26.719 ************************************ 00:05:26.719 
11:41:53 env.env_mem_callbacks -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:26.719 EAL: Detected CPU lcores: 72 00:05:26.719 EAL: Detected NUMA nodes: 2 00:05:26.719 EAL: Detected shared linkage of DPDK 00:05:26.719 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:26.978 EAL: Selected IOVA mode 'PA' 00:05:26.978 EAL: VFIO support initialized 00:05:26.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:26.978 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:05:26.978 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.978 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:05:26.978 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:26.978 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:05:26.978 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.978 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:05:26.978 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, 
max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3d:02.0 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating 
cryptodev 0000:3d:02.4_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:26.979 
CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:05:26.979 
CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 
00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:05:26.979 CRYPTODEV: Initialisation parameters - name: 
0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:26.979 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:05:26.980 CRYPTODEV: Initialisation 
parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 
0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, 
max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:26.980 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:26.980 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:26.980 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:26.980 00:05:26.980 00:05:26.980 CUnit - A unit testing framework for C - Version 2.1-3 00:05:26.980 http://cunit.sourceforge.net/ 00:05:26.980 00:05:26.980 00:05:26.980 Suite: memory 00:05:26.980 Test: test ... 
00:05:26.980 register 0x200000200000 2097152 00:05:26.980 register 0x201000a00000 2097152 00:05:26.980 malloc 3145728 00:05:26.980 register 0x200000400000 4194304 00:05:26.980 buf 0x200000500000 len 3145728 PASSED 00:05:26.980 malloc 64 00:05:26.980 buf 0x2000004fff40 len 64 PASSED 00:05:26.980 malloc 4194304 00:05:26.980 register 0x200000800000 6291456 00:05:26.980 buf 0x200000a00000 len 4194304 PASSED 00:05:26.980 free 0x200000500000 3145728 00:05:26.980 free 0x2000004fff40 64 00:05:26.980 unregister 0x200000400000 4194304 PASSED 00:05:26.980 free 0x200000a00000 4194304 00:05:26.980 unregister 0x200000800000 6291456 PASSED 00:05:26.980 malloc 8388608 00:05:26.980 register 0x200000400000 10485760 00:05:26.980 buf 0x200000600000 len 8388608 PASSED 00:05:26.980 free 0x200000600000 8388608 00:05:26.980 unregister 0x200000400000 10485760 PASSED 00:05:26.980 passed 00:05:26.980 00:05:26.980 Run Summary: Type Total Ran Passed Failed Inactive 00:05:26.980 suites 1 1 n/a 0 0 00:05:26.980 tests 1 1 1 0 0 00:05:26.980 asserts 16 16 16 0 n/a 00:05:26.980 00:05:26.980 Elapsed time = 0.007 seconds 00:05:26.980 00:05:26.980 real 0m0.107s 00:05:26.980 user 0m0.027s 00:05:26.980 sys 0m0.079s 00:05:26.980 11:41:53 env.env_mem_callbacks -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:26.980 11:41:53 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:26.980 ************************************ 00:05:26.980 END TEST env_mem_callbacks 00:05:26.980 ************************************ 00:05:26.980 00:05:26.980 real 0m5.705s 00:05:26.980 user 0m3.430s 00:05:26.980 sys 0m1.817s 00:05:26.980 11:41:53 env -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:26.980 11:41:53 env -- common/autotest_common.sh@10 -- # set +x 00:05:26.980 ************************************ 00:05:26.980 END TEST env 00:05:26.980 ************************************ 00:05:26.980 11:41:53 -- spdk/autotest.sh@165 -- # run_test rpc 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:26.980 11:41:53 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:26.980 11:41:53 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:26.980 11:41:53 -- common/autotest_common.sh@10 -- # set +x 00:05:26.980 ************************************ 00:05:26.980 START TEST rpc 00:05:26.980 ************************************ 00:05:26.980 11:41:54 rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:27.239 * Looking for test storage... 00:05:27.239 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:27.239 11:41:54 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1623567 00:05:27.239 11:41:54 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:27.239 11:41:54 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1623567 00:05:27.239 11:41:54 rpc -- common/autotest_common.sh@827 -- # '[' -z 1623567 ']' 00:05:27.239 11:41:54 rpc -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:27.239 11:41:54 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:27.239 11:41:54 rpc -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:27.239 11:41:54 rpc -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:27.239 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:27.239 11:41:54 rpc -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:27.239 11:41:54 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:27.239 [2024-05-14 11:41:54.179663] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:05:27.239 [2024-05-14 11:41:54.179742] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623567 ] 00:05:27.239 [2024-05-14 11:41:54.308177] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:27.499 [2024-05-14 11:41:54.412732] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:27.499 [2024-05-14 11:41:54.412779] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1623567' to capture a snapshot of events at runtime. 00:05:27.499 [2024-05-14 11:41:54.412794] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:27.499 [2024-05-14 11:41:54.412807] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:27.499 [2024-05-14 11:41:54.412818] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1623567 for offline analysis/debug. 
00:05:27.499 [2024-05-14 11:41:54.412858] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:28.065 11:41:55 rpc -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:28.065 11:41:55 rpc -- common/autotest_common.sh@860 -- # return 0 00:05:28.065 11:41:55 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:28.065 11:41:55 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:28.066 11:41:55 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:28.066 11:41:55 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:28.066 11:41:55 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:28.066 11:41:55 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:28.066 11:41:55 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:28.066 ************************************ 00:05:28.066 START TEST rpc_integrity 00:05:28.066 ************************************ 00:05:28.066 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@1121 -- # rpc_integrity 00:05:28.066 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:28.066 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.066 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:28.324 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.324 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # 
bdevs='[]' 00:05:28.324 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:28.324 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:28.324 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:28.324 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.324 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:28.324 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.324 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:28.324 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:28.324 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:28.325 { 00:05:28.325 "name": "Malloc0", 00:05:28.325 "aliases": [ 00:05:28.325 "115a5a36-498b-47a1-b715-d32813d68301" 00:05:28.325 ], 00:05:28.325 "product_name": "Malloc disk", 00:05:28.325 "block_size": 512, 00:05:28.325 "num_blocks": 16384, 00:05:28.325 "uuid": "115a5a36-498b-47a1-b715-d32813d68301", 00:05:28.325 "assigned_rate_limits": { 00:05:28.325 "rw_ios_per_sec": 0, 00:05:28.325 "rw_mbytes_per_sec": 0, 00:05:28.325 "r_mbytes_per_sec": 0, 00:05:28.325 "w_mbytes_per_sec": 0 00:05:28.325 }, 00:05:28.325 "claimed": false, 00:05:28.325 "zoned": false, 00:05:28.325 "supported_io_types": { 00:05:28.325 "read": true, 00:05:28.325 "write": true, 00:05:28.325 "unmap": true, 00:05:28.325 "write_zeroes": true, 00:05:28.325 "flush": true, 00:05:28.325 "reset": true, 00:05:28.325 "compare": false, 00:05:28.325 "compare_and_write": false, 00:05:28.325 "abort": true, 00:05:28.325 "nvme_admin": false, 00:05:28.325 "nvme_io": false 00:05:28.325 }, 00:05:28.325 
"memory_domains": [ 00:05:28.325 { 00:05:28.325 "dma_device_id": "system", 00:05:28.325 "dma_device_type": 1 00:05:28.325 }, 00:05:28.325 { 00:05:28.325 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:28.325 "dma_device_type": 2 00:05:28.325 } 00:05:28.325 ], 00:05:28.325 "driver_specific": {} 00:05:28.325 } 00:05:28.325 ]' 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:28.325 [2024-05-14 11:41:55.281162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:28.325 [2024-05-14 11:41:55.281200] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:28.325 [2024-05-14 11:41:55.281218] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2729650 00:05:28.325 [2024-05-14 11:41:55.281230] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:28.325 [2024-05-14 11:41:55.282786] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:28.325 [2024-05-14 11:41:55.282813] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:28.325 Passthru0 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.325 11:41:55 rpc.rpc_integrity 
-- rpc/rpc.sh@20 -- # bdevs='[ 00:05:28.325 { 00:05:28.325 "name": "Malloc0", 00:05:28.325 "aliases": [ 00:05:28.325 "115a5a36-498b-47a1-b715-d32813d68301" 00:05:28.325 ], 00:05:28.325 "product_name": "Malloc disk", 00:05:28.325 "block_size": 512, 00:05:28.325 "num_blocks": 16384, 00:05:28.325 "uuid": "115a5a36-498b-47a1-b715-d32813d68301", 00:05:28.325 "assigned_rate_limits": { 00:05:28.325 "rw_ios_per_sec": 0, 00:05:28.325 "rw_mbytes_per_sec": 0, 00:05:28.325 "r_mbytes_per_sec": 0, 00:05:28.325 "w_mbytes_per_sec": 0 00:05:28.325 }, 00:05:28.325 "claimed": true, 00:05:28.325 "claim_type": "exclusive_write", 00:05:28.325 "zoned": false, 00:05:28.325 "supported_io_types": { 00:05:28.325 "read": true, 00:05:28.325 "write": true, 00:05:28.325 "unmap": true, 00:05:28.325 "write_zeroes": true, 00:05:28.325 "flush": true, 00:05:28.325 "reset": true, 00:05:28.325 "compare": false, 00:05:28.325 "compare_and_write": false, 00:05:28.325 "abort": true, 00:05:28.325 "nvme_admin": false, 00:05:28.325 "nvme_io": false 00:05:28.325 }, 00:05:28.325 "memory_domains": [ 00:05:28.325 { 00:05:28.325 "dma_device_id": "system", 00:05:28.325 "dma_device_type": 1 00:05:28.325 }, 00:05:28.325 { 00:05:28.325 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:28.325 "dma_device_type": 2 00:05:28.325 } 00:05:28.325 ], 00:05:28.325 "driver_specific": {} 00:05:28.325 }, 00:05:28.325 { 00:05:28.325 "name": "Passthru0", 00:05:28.325 "aliases": [ 00:05:28.325 "0f13d566-1e9b-5ae5-be22-0077ab9215c2" 00:05:28.325 ], 00:05:28.325 "product_name": "passthru", 00:05:28.325 "block_size": 512, 00:05:28.325 "num_blocks": 16384, 00:05:28.325 "uuid": "0f13d566-1e9b-5ae5-be22-0077ab9215c2", 00:05:28.325 "assigned_rate_limits": { 00:05:28.325 "rw_ios_per_sec": 0, 00:05:28.325 "rw_mbytes_per_sec": 0, 00:05:28.325 "r_mbytes_per_sec": 0, 00:05:28.325 "w_mbytes_per_sec": 0 00:05:28.325 }, 00:05:28.325 "claimed": false, 00:05:28.325 "zoned": false, 00:05:28.325 "supported_io_types": { 00:05:28.325 "read": true, 
00:05:28.325 "write": true, 00:05:28.325 "unmap": true, 00:05:28.325 "write_zeroes": true, 00:05:28.325 "flush": true, 00:05:28.325 "reset": true, 00:05:28.325 "compare": false, 00:05:28.325 "compare_and_write": false, 00:05:28.325 "abort": true, 00:05:28.325 "nvme_admin": false, 00:05:28.325 "nvme_io": false 00:05:28.325 }, 00:05:28.325 "memory_domains": [ 00:05:28.325 { 00:05:28.325 "dma_device_id": "system", 00:05:28.325 "dma_device_type": 1 00:05:28.325 }, 00:05:28.325 { 00:05:28.325 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:28.325 "dma_device_type": 2 00:05:28.325 } 00:05:28.325 ], 00:05:28.325 "driver_specific": { 00:05:28.325 "passthru": { 00:05:28.325 "name": "Passthru0", 00:05:28.325 "base_bdev_name": "Malloc0" 00:05:28.325 } 00:05:28.325 } 00:05:28.325 } 00:05:28.325 ]' 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.325 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:28.325 11:41:55 rpc.rpc_integrity -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:28.325 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:28.584 11:41:55 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:28.584 00:05:28.584 real 0m0.297s 00:05:28.584 user 0m0.181s 00:05:28.584 sys 0m0.050s 00:05:28.584 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:28.584 11:41:55 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:28.584 ************************************ 00:05:28.584 END TEST rpc_integrity 00:05:28.584 ************************************ 00:05:28.584 11:41:55 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:28.584 11:41:55 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:28.584 11:41:55 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:28.584 11:41:55 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:28.584 ************************************ 00:05:28.584 START TEST rpc_plugins 00:05:28.584 ************************************ 00:05:28.584 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@1121 -- # rpc_plugins 00:05:28.584 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:28.584 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.584 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:28.584 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.584 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:28.584 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:28.584 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.584 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:28.584 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 
00:05:28.584 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:28.584 { 00:05:28.584 "name": "Malloc1", 00:05:28.584 "aliases": [ 00:05:28.584 "72fea24d-6650-4a9c-b0f9-946c59eb14a2" 00:05:28.584 ], 00:05:28.584 "product_name": "Malloc disk", 00:05:28.584 "block_size": 4096, 00:05:28.584 "num_blocks": 256, 00:05:28.584 "uuid": "72fea24d-6650-4a9c-b0f9-946c59eb14a2", 00:05:28.584 "assigned_rate_limits": { 00:05:28.584 "rw_ios_per_sec": 0, 00:05:28.584 "rw_mbytes_per_sec": 0, 00:05:28.584 "r_mbytes_per_sec": 0, 00:05:28.584 "w_mbytes_per_sec": 0 00:05:28.584 }, 00:05:28.584 "claimed": false, 00:05:28.584 "zoned": false, 00:05:28.584 "supported_io_types": { 00:05:28.584 "read": true, 00:05:28.584 "write": true, 00:05:28.584 "unmap": true, 00:05:28.584 "write_zeroes": true, 00:05:28.584 "flush": true, 00:05:28.584 "reset": true, 00:05:28.584 "compare": false, 00:05:28.584 "compare_and_write": false, 00:05:28.584 "abort": true, 00:05:28.584 "nvme_admin": false, 00:05:28.584 "nvme_io": false 00:05:28.584 }, 00:05:28.584 "memory_domains": [ 00:05:28.584 { 00:05:28.584 "dma_device_id": "system", 00:05:28.584 "dma_device_type": 1 00:05:28.584 }, 00:05:28.584 { 00:05:28.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:28.584 "dma_device_type": 2 00:05:28.584 } 00:05:28.584 ], 00:05:28.584 "driver_specific": {} 00:05:28.585 } 00:05:28.585 ]' 00:05:28.585 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:28.585 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:28.585 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:28.585 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.585 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:28.585 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.585 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:28.585 11:41:55 
rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.585 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:28.585 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.585 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:28.585 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:28.844 11:41:55 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:28.844 00:05:28.844 real 0m0.146s 00:05:28.844 user 0m0.096s 00:05:28.844 sys 0m0.017s 00:05:28.844 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:28.844 11:41:55 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:28.844 ************************************ 00:05:28.844 END TEST rpc_plugins 00:05:28.844 ************************************ 00:05:28.844 11:41:55 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:28.844 11:41:55 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:28.844 11:41:55 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:28.844 11:41:55 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:28.844 ************************************ 00:05:28.844 START TEST rpc_trace_cmd_test 00:05:28.844 ************************************ 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1121 -- # rpc_trace_cmd_test 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:28.844 "tpoint_shm_path": 
"/dev/shm/spdk_tgt_trace.pid1623567", 00:05:28.844 "tpoint_group_mask": "0x8", 00:05:28.844 "iscsi_conn": { 00:05:28.844 "mask": "0x2", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "scsi": { 00:05:28.844 "mask": "0x4", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "bdev": { 00:05:28.844 "mask": "0x8", 00:05:28.844 "tpoint_mask": "0xffffffffffffffff" 00:05:28.844 }, 00:05:28.844 "nvmf_rdma": { 00:05:28.844 "mask": "0x10", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "nvmf_tcp": { 00:05:28.844 "mask": "0x20", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "ftl": { 00:05:28.844 "mask": "0x40", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "blobfs": { 00:05:28.844 "mask": "0x80", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "dsa": { 00:05:28.844 "mask": "0x200", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "thread": { 00:05:28.844 "mask": "0x400", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "nvme_pcie": { 00:05:28.844 "mask": "0x800", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "iaa": { 00:05:28.844 "mask": "0x1000", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "nvme_tcp": { 00:05:28.844 "mask": "0x2000", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "bdev_nvme": { 00:05:28.844 "mask": "0x4000", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 }, 00:05:28.844 "sock": { 00:05:28.844 "mask": "0x8000", 00:05:28.844 "tpoint_mask": "0x0" 00:05:28.844 } 00:05:28.844 }' 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 
'has("tpoint_shm_path")' 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:28.844 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:29.103 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:29.103 11:41:55 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:29.103 11:41:56 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:29.103 00:05:29.103 real 0m0.250s 00:05:29.103 user 0m0.205s 00:05:29.103 sys 0m0.039s 00:05:29.103 11:41:56 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:29.103 11:41:56 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:29.103 ************************************ 00:05:29.103 END TEST rpc_trace_cmd_test 00:05:29.103 ************************************ 00:05:29.103 11:41:56 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:29.103 11:41:56 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:29.103 11:41:56 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:29.103 11:41:56 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:29.103 11:41:56 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:29.103 11:41:56 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:29.103 ************************************ 00:05:29.103 START TEST rpc_daemon_integrity 00:05:29.103 ************************************ 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1121 -- # rpc_integrity 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:29.103 11:41:56 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:29.103 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:29.360 { 00:05:29.360 "name": "Malloc2", 00:05:29.360 "aliases": [ 00:05:29.360 "a46dfa7e-8b11-4f98-8653-ecfe7a2912a0" 00:05:29.360 ], 00:05:29.360 "product_name": "Malloc disk", 00:05:29.360 "block_size": 512, 00:05:29.360 "num_blocks": 16384, 00:05:29.360 "uuid": "a46dfa7e-8b11-4f98-8653-ecfe7a2912a0", 00:05:29.360 "assigned_rate_limits": { 00:05:29.360 "rw_ios_per_sec": 0, 00:05:29.360 "rw_mbytes_per_sec": 0, 00:05:29.360 "r_mbytes_per_sec": 0, 00:05:29.360 "w_mbytes_per_sec": 0 00:05:29.360 }, 00:05:29.360 "claimed": false, 00:05:29.360 "zoned": false, 00:05:29.360 "supported_io_types": { 00:05:29.360 "read": true, 00:05:29.360 "write": true, 00:05:29.360 "unmap": true, 00:05:29.360 "write_zeroes": true, 00:05:29.360 "flush": true, 00:05:29.360 "reset": true, 00:05:29.360 "compare": false, 00:05:29.360 "compare_and_write": 
false, 00:05:29.360 "abort": true, 00:05:29.360 "nvme_admin": false, 00:05:29.360 "nvme_io": false 00:05:29.360 }, 00:05:29.360 "memory_domains": [ 00:05:29.360 { 00:05:29.360 "dma_device_id": "system", 00:05:29.360 "dma_device_type": 1 00:05:29.360 }, 00:05:29.360 { 00:05:29.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.360 "dma_device_type": 2 00:05:29.360 } 00:05:29.360 ], 00:05:29.360 "driver_specific": {} 00:05:29.360 } 00:05:29.360 ]' 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:29.360 [2024-05-14 11:41:56.247925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:29.360 [2024-05-14 11:41:56.247960] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:29.360 [2024-05-14 11:41:56.247981] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28c1280 00:05:29.360 [2024-05-14 11:41:56.247995] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:29.360 [2024-05-14 11:41:56.249573] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:29.360 [2024-05-14 11:41:56.249602] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:29.360 Passthru0 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:29.360 11:41:56 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:29.360 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:29.360 { 00:05:29.360 "name": "Malloc2", 00:05:29.360 "aliases": [ 00:05:29.360 "a46dfa7e-8b11-4f98-8653-ecfe7a2912a0" 00:05:29.360 ], 00:05:29.360 "product_name": "Malloc disk", 00:05:29.360 "block_size": 512, 00:05:29.360 "num_blocks": 16384, 00:05:29.361 "uuid": "a46dfa7e-8b11-4f98-8653-ecfe7a2912a0", 00:05:29.361 "assigned_rate_limits": { 00:05:29.361 "rw_ios_per_sec": 0, 00:05:29.361 "rw_mbytes_per_sec": 0, 00:05:29.361 "r_mbytes_per_sec": 0, 00:05:29.361 "w_mbytes_per_sec": 0 00:05:29.361 }, 00:05:29.361 "claimed": true, 00:05:29.361 "claim_type": "exclusive_write", 00:05:29.361 "zoned": false, 00:05:29.361 "supported_io_types": { 00:05:29.361 "read": true, 00:05:29.361 "write": true, 00:05:29.361 "unmap": true, 00:05:29.361 "write_zeroes": true, 00:05:29.361 "flush": true, 00:05:29.361 "reset": true, 00:05:29.361 "compare": false, 00:05:29.361 "compare_and_write": false, 00:05:29.361 "abort": true, 00:05:29.361 "nvme_admin": false, 00:05:29.361 "nvme_io": false 00:05:29.361 }, 00:05:29.361 "memory_domains": [ 00:05:29.361 { 00:05:29.361 "dma_device_id": "system", 00:05:29.361 "dma_device_type": 1 00:05:29.361 }, 00:05:29.361 { 00:05:29.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.361 "dma_device_type": 2 00:05:29.361 } 00:05:29.361 ], 00:05:29.361 "driver_specific": {} 00:05:29.361 }, 00:05:29.361 { 00:05:29.361 "name": "Passthru0", 00:05:29.361 "aliases": [ 00:05:29.361 "7af81bc9-862d-567b-b241-cf1c66907dfa" 00:05:29.361 ], 00:05:29.361 "product_name": "passthru", 00:05:29.361 "block_size": 512, 00:05:29.361 "num_blocks": 16384, 00:05:29.361 "uuid": "7af81bc9-862d-567b-b241-cf1c66907dfa", 00:05:29.361 "assigned_rate_limits": { 00:05:29.361 "rw_ios_per_sec": 0, 00:05:29.361 "rw_mbytes_per_sec": 0, 
00:05:29.361 "r_mbytes_per_sec": 0, 00:05:29.361 "w_mbytes_per_sec": 0 00:05:29.361 }, 00:05:29.361 "claimed": false, 00:05:29.361 "zoned": false, 00:05:29.361 "supported_io_types": { 00:05:29.361 "read": true, 00:05:29.361 "write": true, 00:05:29.361 "unmap": true, 00:05:29.361 "write_zeroes": true, 00:05:29.361 "flush": true, 00:05:29.361 "reset": true, 00:05:29.361 "compare": false, 00:05:29.361 "compare_and_write": false, 00:05:29.361 "abort": true, 00:05:29.361 "nvme_admin": false, 00:05:29.361 "nvme_io": false 00:05:29.361 }, 00:05:29.361 "memory_domains": [ 00:05:29.361 { 00:05:29.361 "dma_device_id": "system", 00:05:29.361 "dma_device_type": 1 00:05:29.361 }, 00:05:29.361 { 00:05:29.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.361 "dma_device_type": 2 00:05:29.361 } 00:05:29.361 ], 00:05:29.361 "driver_specific": { 00:05:29.361 "passthru": { 00:05:29.361 "name": "Passthru0", 00:05:29.361 "base_bdev_name": "Malloc2" 00:05:29.361 } 00:05:29.361 } 00:05:29.361 } 00:05:29.361 ]' 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:29.361 11:41:56 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:29.361 00:05:29.361 real 0m0.300s 00:05:29.361 user 0m0.188s 00:05:29.361 sys 0m0.047s 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:29.361 11:41:56 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:29.361 ************************************ 00:05:29.361 END TEST rpc_daemon_integrity 00:05:29.361 ************************************ 00:05:29.361 11:41:56 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:29.361 11:41:56 rpc -- rpc/rpc.sh@84 -- # killprocess 1623567 00:05:29.361 11:41:56 rpc -- common/autotest_common.sh@946 -- # '[' -z 1623567 ']' 00:05:29.361 11:41:56 rpc -- common/autotest_common.sh@950 -- # kill -0 1623567 00:05:29.619 11:41:56 rpc -- common/autotest_common.sh@951 -- # uname 00:05:29.619 11:41:56 rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:29.619 11:41:56 rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1623567 00:05:29.619 11:41:56 rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:29.619 11:41:56 rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:29.619 11:41:56 rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1623567' 00:05:29.619 killing process with pid 1623567 00:05:29.619 11:41:56 rpc -- common/autotest_common.sh@965 -- # kill 1623567 
00:05:29.619 11:41:56 rpc -- common/autotest_common.sh@970 -- # wait 1623567 00:05:29.887 00:05:29.887 real 0m2.869s 00:05:29.887 user 0m3.646s 00:05:29.887 sys 0m0.900s 00:05:29.887 11:41:56 rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:29.887 11:41:56 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:29.887 ************************************ 00:05:29.887 END TEST rpc 00:05:29.887 ************************************ 00:05:29.887 11:41:56 -- spdk/autotest.sh@166 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:29.887 11:41:56 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:29.887 11:41:56 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:29.887 11:41:56 -- common/autotest_common.sh@10 -- # set +x 00:05:30.183 ************************************ 00:05:30.183 START TEST skip_rpc 00:05:30.183 ************************************ 00:05:30.183 11:41:56 skip_rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:30.183 * Looking for test storage... 
00:05:30.183 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:30.183 11:41:57 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:30.183 11:41:57 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:30.183 11:41:57 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:30.183 11:41:57 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:30.183 11:41:57 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:30.183 11:41:57 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:30.183 ************************************ 00:05:30.183 START TEST skip_rpc 00:05:30.183 ************************************ 00:05:30.183 11:41:57 skip_rpc.skip_rpc -- common/autotest_common.sh@1121 -- # test_skip_rpc 00:05:30.183 11:41:57 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:30.183 11:41:57 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1624098 00:05:30.183 11:41:57 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:30.183 11:41:57 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:30.183 [2024-05-14 11:41:57.181543] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:05:30.183 [2024-05-14 11:41:57.181603] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624098 ] 00:05:30.442 [2024-05-14 11:41:57.312310] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:30.442 [2024-05-14 11:41:57.410746] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - 
SIGINT SIGTERM EXIT 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1624098 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@946 -- # '[' -z 1624098 ']' 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # kill -0 1624098 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@951 -- # uname 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1624098 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1624098' 00:05:35.712 killing process with pid 1624098 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@965 -- # kill 1624098 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@970 -- # wait 1624098 00:05:35.712 00:05:35.712 real 0m5.449s 00:05:35.712 user 0m5.095s 00:05:35.712 sys 0m0.366s 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:35.712 11:42:02 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.712 ************************************ 00:05:35.712 END TEST skip_rpc 00:05:35.712 ************************************ 00:05:35.712 11:42:02 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:35.712 11:42:02 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:35.712 11:42:02 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:35.712 11:42:02 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:35.712 ************************************ 00:05:35.712 START TEST skip_rpc_with_json 
00:05:35.712 ************************************ 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1121 -- # test_skip_rpc_with_json 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1624956 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1624956 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@827 -- # '[' -z 1624956 ']' 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.712 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:35.712 11:42:02 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:35.712 [2024-05-14 11:42:02.719551] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:05:35.712 [2024-05-14 11:42:02.719618] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624956 ] 00:05:35.972 [2024-05-14 11:42:02.839423] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.972 [2024-05-14 11:42:02.946461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # return 0 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:36.908 [2024-05-14 11:42:03.640178] nvmf_rpc.c:2531:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:36.908 request: 00:05:36.908 { 00:05:36.908 "trtype": "tcp", 00:05:36.908 "method": "nvmf_get_transports", 00:05:36.908 "req_id": 1 00:05:36.908 } 00:05:36.908 Got JSON-RPC error response 00:05:36.908 response: 00:05:36.908 { 00:05:36.908 "code": -19, 00:05:36.908 "message": "No such device" 00:05:36.908 } 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:36.908 [2024-05-14 11:42:03.652310] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:36.908 11:42:03 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:36.908 11:42:03 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:36.908 { 00:05:36.908 "subsystems": [ 00:05:36.908 { 00:05:36.908 "subsystem": "keyring", 00:05:36.908 "config": [] 00:05:36.908 }, 00:05:36.908 { 00:05:36.908 "subsystem": "iobuf", 00:05:36.908 "config": [ 00:05:36.908 { 00:05:36.908 "method": "iobuf_set_options", 00:05:36.908 "params": { 00:05:36.908 "small_pool_count": 8192, 00:05:36.908 "large_pool_count": 1024, 00:05:36.908 "small_bufsize": 8192, 00:05:36.908 "large_bufsize": 135168 00:05:36.908 } 00:05:36.908 } 00:05:36.908 ] 00:05:36.908 }, 00:05:36.908 { 00:05:36.908 "subsystem": "sock", 00:05:36.908 "config": [ 00:05:36.908 { 00:05:36.908 "method": "sock_impl_set_options", 00:05:36.908 "params": { 00:05:36.908 "impl_name": "posix", 00:05:36.908 "recv_buf_size": 2097152, 00:05:36.908 "send_buf_size": 2097152, 00:05:36.908 "enable_recv_pipe": true, 00:05:36.908 "enable_quickack": false, 00:05:36.908 "enable_placement_id": 0, 00:05:36.908 "enable_zerocopy_send_server": true, 00:05:36.908 "enable_zerocopy_send_client": false, 00:05:36.908 "zerocopy_threshold": 0, 00:05:36.908 "tls_version": 0, 00:05:36.908 "enable_ktls": false 00:05:36.908 } 00:05:36.908 }, 00:05:36.908 { 00:05:36.908 "method": "sock_impl_set_options", 00:05:36.908 "params": { 00:05:36.908 "impl_name": "ssl", 00:05:36.908 "recv_buf_size": 4096, 00:05:36.908 "send_buf_size": 4096, 00:05:36.908 "enable_recv_pipe": true, 
00:05:36.908 "enable_quickack": false, 00:05:36.908 "enable_placement_id": 0, 00:05:36.908 "enable_zerocopy_send_server": true, 00:05:36.908 "enable_zerocopy_send_client": false, 00:05:36.908 "zerocopy_threshold": 0, 00:05:36.908 "tls_version": 0, 00:05:36.908 "enable_ktls": false 00:05:36.908 } 00:05:36.908 } 00:05:36.908 ] 00:05:36.908 }, 00:05:36.908 { 00:05:36.908 "subsystem": "vmd", 00:05:36.908 "config": [] 00:05:36.908 }, 00:05:36.908 { 00:05:36.908 "subsystem": "accel", 00:05:36.908 "config": [ 00:05:36.908 { 00:05:36.908 "method": "accel_set_options", 00:05:36.908 "params": { 00:05:36.908 "small_cache_size": 128, 00:05:36.908 "large_cache_size": 16, 00:05:36.908 "task_count": 2048, 00:05:36.908 "sequence_count": 2048, 00:05:36.908 "buf_count": 2048 00:05:36.908 } 00:05:36.908 } 00:05:36.908 ] 00:05:36.908 }, 00:05:36.908 { 00:05:36.908 "subsystem": "bdev", 00:05:36.908 "config": [ 00:05:36.908 { 00:05:36.908 "method": "bdev_set_options", 00:05:36.908 "params": { 00:05:36.908 "bdev_io_pool_size": 65535, 00:05:36.908 "bdev_io_cache_size": 256, 00:05:36.908 "bdev_auto_examine": true, 00:05:36.908 "iobuf_small_cache_size": 128, 00:05:36.908 "iobuf_large_cache_size": 16 00:05:36.908 } 00:05:36.908 }, 00:05:36.908 { 00:05:36.908 "method": "bdev_raid_set_options", 00:05:36.908 "params": { 00:05:36.908 "process_window_size_kb": 1024 00:05:36.908 } 00:05:36.908 }, 00:05:36.908 { 00:05:36.908 "method": "bdev_iscsi_set_options", 00:05:36.908 "params": { 00:05:36.908 "timeout_sec": 30 00:05:36.908 } 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "method": "bdev_nvme_set_options", 00:05:36.909 "params": { 00:05:36.909 "action_on_timeout": "none", 00:05:36.909 "timeout_us": 0, 00:05:36.909 "timeout_admin_us": 0, 00:05:36.909 "keep_alive_timeout_ms": 10000, 00:05:36.909 "arbitration_burst": 0, 00:05:36.909 "low_priority_weight": 0, 00:05:36.909 "medium_priority_weight": 0, 00:05:36.909 "high_priority_weight": 0, 00:05:36.909 "nvme_adminq_poll_period_us": 10000, 00:05:36.909 
"nvme_ioq_poll_period_us": 0, 00:05:36.909 "io_queue_requests": 0, 00:05:36.909 "delay_cmd_submit": true, 00:05:36.909 "transport_retry_count": 4, 00:05:36.909 "bdev_retry_count": 3, 00:05:36.909 "transport_ack_timeout": 0, 00:05:36.909 "ctrlr_loss_timeout_sec": 0, 00:05:36.909 "reconnect_delay_sec": 0, 00:05:36.909 "fast_io_fail_timeout_sec": 0, 00:05:36.909 "disable_auto_failback": false, 00:05:36.909 "generate_uuids": false, 00:05:36.909 "transport_tos": 0, 00:05:36.909 "nvme_error_stat": false, 00:05:36.909 "rdma_srq_size": 0, 00:05:36.909 "io_path_stat": false, 00:05:36.909 "allow_accel_sequence": false, 00:05:36.909 "rdma_max_cq_size": 0, 00:05:36.909 "rdma_cm_event_timeout_ms": 0, 00:05:36.909 "dhchap_digests": [ 00:05:36.909 "sha256", 00:05:36.909 "sha384", 00:05:36.909 "sha512" 00:05:36.909 ], 00:05:36.909 "dhchap_dhgroups": [ 00:05:36.909 "null", 00:05:36.909 "ffdhe2048", 00:05:36.909 "ffdhe3072", 00:05:36.909 "ffdhe4096", 00:05:36.909 "ffdhe6144", 00:05:36.909 "ffdhe8192" 00:05:36.909 ] 00:05:36.909 } 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "method": "bdev_nvme_set_hotplug", 00:05:36.909 "params": { 00:05:36.909 "period_us": 100000, 00:05:36.909 "enable": false 00:05:36.909 } 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "method": "bdev_wait_for_examine" 00:05:36.909 } 00:05:36.909 ] 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "subsystem": "scsi", 00:05:36.909 "config": null 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "subsystem": "scheduler", 00:05:36.909 "config": [ 00:05:36.909 { 00:05:36.909 "method": "framework_set_scheduler", 00:05:36.909 "params": { 00:05:36.909 "name": "static" 00:05:36.909 } 00:05:36.909 } 00:05:36.909 ] 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "subsystem": "vhost_scsi", 00:05:36.909 "config": [] 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "subsystem": "vhost_blk", 00:05:36.909 "config": [] 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "subsystem": "ublk", 00:05:36.909 "config": [] 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 
"subsystem": "nbd", 00:05:36.909 "config": [] 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "subsystem": "nvmf", 00:05:36.909 "config": [ 00:05:36.909 { 00:05:36.909 "method": "nvmf_set_config", 00:05:36.909 "params": { 00:05:36.909 "discovery_filter": "match_any", 00:05:36.909 "admin_cmd_passthru": { 00:05:36.909 "identify_ctrlr": false 00:05:36.909 } 00:05:36.909 } 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "method": "nvmf_set_max_subsystems", 00:05:36.909 "params": { 00:05:36.909 "max_subsystems": 1024 00:05:36.909 } 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "method": "nvmf_set_crdt", 00:05:36.909 "params": { 00:05:36.909 "crdt1": 0, 00:05:36.909 "crdt2": 0, 00:05:36.909 "crdt3": 0 00:05:36.909 } 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "method": "nvmf_create_transport", 00:05:36.909 "params": { 00:05:36.909 "trtype": "TCP", 00:05:36.909 "max_queue_depth": 128, 00:05:36.909 "max_io_qpairs_per_ctrlr": 127, 00:05:36.909 "in_capsule_data_size": 4096, 00:05:36.909 "max_io_size": 131072, 00:05:36.909 "io_unit_size": 131072, 00:05:36.909 "max_aq_depth": 128, 00:05:36.909 "num_shared_buffers": 511, 00:05:36.909 "buf_cache_size": 4294967295, 00:05:36.909 "dif_insert_or_strip": false, 00:05:36.909 "zcopy": false, 00:05:36.909 "c2h_success": true, 00:05:36.909 "sock_priority": 0, 00:05:36.909 "abort_timeout_sec": 1, 00:05:36.909 "ack_timeout": 0, 00:05:36.909 "data_wr_pool_size": 0 00:05:36.909 } 00:05:36.909 } 00:05:36.909 ] 00:05:36.909 }, 00:05:36.909 { 00:05:36.909 "subsystem": "iscsi", 00:05:36.909 "config": [ 00:05:36.909 { 00:05:36.909 "method": "iscsi_set_options", 00:05:36.909 "params": { 00:05:36.909 "node_base": "iqn.2016-06.io.spdk", 00:05:36.909 "max_sessions": 128, 00:05:36.909 "max_connections_per_session": 2, 00:05:36.909 "max_queue_depth": 64, 00:05:36.909 "default_time2wait": 2, 00:05:36.909 "default_time2retain": 20, 00:05:36.909 "first_burst_length": 8192, 00:05:36.909 "immediate_data": true, 00:05:36.909 "allow_duplicated_isid": false, 
00:05:36.909 "error_recovery_level": 0, 00:05:36.909 "nop_timeout": 60, 00:05:36.909 "nop_in_interval": 30, 00:05:36.909 "disable_chap": false, 00:05:36.909 "require_chap": false, 00:05:36.909 "mutual_chap": false, 00:05:36.909 "chap_group": 0, 00:05:36.909 "max_large_datain_per_connection": 64, 00:05:36.909 "max_r2t_per_connection": 4, 00:05:36.909 "pdu_pool_size": 36864, 00:05:36.909 "immediate_data_pool_size": 16384, 00:05:36.909 "data_out_pool_size": 2048 00:05:36.909 } 00:05:36.909 } 00:05:36.909 ] 00:05:36.909 } 00:05:36.909 ] 00:05:36.909 } 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1624956 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@946 -- # '[' -z 1624956 ']' 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # kill -0 1624956 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # uname 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1624956 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1624956' 00:05:36.909 killing process with pid 1624956 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@965 -- # kill 1624956 00:05:36.909 11:42:03 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@970 -- # wait 1624956 00:05:37.478 11:42:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1625184 
00:05:37.478 11:42:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:37.478 11:42:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1625184 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@946 -- # '[' -z 1625184 ']' 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # kill -0 1625184 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # uname 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1625184 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1625184' 00:05:42.744 killing process with pid 1625184 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@965 -- # kill 1625184 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@970 -- # wait 1625184 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:42.744 00:05:42.744 real 0m7.064s 00:05:42.744 user 0m6.750s 00:05:42.744 sys 0m0.854s 00:05:42.744 11:42:09 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:42.744 ************************************ 00:05:42.744 END TEST skip_rpc_with_json 00:05:42.744 ************************************ 00:05:42.744 11:42:09 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:42.744 11:42:09 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:42.744 11:42:09 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:42.744 11:42:09 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:42.744 ************************************ 00:05:42.744 START TEST skip_rpc_with_delay 00:05:42.744 ************************************ 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1121 -- # test_skip_rpc_with_delay 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:42.744 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:43.004 [2024-05-14 11:42:09.873236] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:05:43.004 [2024-05-14 11:42:09.873313] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:05:43.004 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:05:43.004 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:43.004 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:43.004 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:43.004 00:05:43.004 real 0m0.081s 00:05:43.004 user 0m0.042s 00:05:43.004 sys 0m0.038s 00:05:43.004 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:43.004 11:42:09 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:05:43.004 ************************************ 00:05:43.004 END TEST skip_rpc_with_delay 00:05:43.004 ************************************ 00:05:43.004 11:42:09 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:05:43.004 11:42:09 
skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:43.004 11:42:09 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:43.004 11:42:09 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:43.004 11:42:09 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:43.004 11:42:09 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:43.004 ************************************ 00:05:43.004 START TEST exit_on_failed_rpc_init 00:05:43.004 ************************************ 00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1121 -- # test_exit_on_failed_rpc_init 00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1626434 00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1626434 00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@827 -- # '[' -z 1626434 ']' 00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:43.004 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:43.004 11:42:09 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:43.004 [2024-05-14 11:42:10.054953] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:05:43.004 [2024-05-14 11:42:10.055023] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626434 ] 00:05:43.262 [2024-05-14 11:42:10.188538] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.262 [2024-05-14 11:42:10.291316] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # return 0 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:44.199 11:42:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:44.199 [2024-05-14 11:42:11.052043] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:05:44.199 [2024-05-14 11:42:11.052114] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626461 ] 00:05:44.199 [2024-05-14 11:42:11.173310] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:44.199 [2024-05-14 11:42:11.270824] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:44.199 [2024-05-14 11:42:11.270912] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:05:44.199 [2024-05-14 11:42:11.270929] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:44.199 [2024-05-14 11:42:11.270942] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1626434 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@946 -- # '[' -z 1626434 ']' 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # kill -0 1626434 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@951 -- # uname 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1626434 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1626434' 
00:05:44.458 killing process with pid 1626434 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@965 -- # kill 1626434 00:05:44.458 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@970 -- # wait 1626434 00:05:45.026 00:05:45.026 real 0m1.835s 00:05:45.026 user 0m2.101s 00:05:45.026 sys 0m0.623s 00:05:45.026 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:45.026 11:42:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:45.026 ************************************ 00:05:45.026 END TEST exit_on_failed_rpc_init 00:05:45.026 ************************************ 00:05:45.026 11:42:11 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:45.026 00:05:45.026 real 0m14.900s 00:05:45.026 user 0m14.151s 00:05:45.026 sys 0m2.206s 00:05:45.026 11:42:11 skip_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:45.026 11:42:11 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:45.026 ************************************ 00:05:45.026 END TEST skip_rpc 00:05:45.026 ************************************ 00:05:45.026 11:42:11 -- spdk/autotest.sh@167 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:45.026 11:42:11 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:45.026 11:42:11 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:45.026 11:42:11 -- common/autotest_common.sh@10 -- # set +x 00:05:45.026 ************************************ 00:05:45.026 START TEST rpc_client 00:05:45.026 ************************************ 00:05:45.026 11:42:11 rpc_client -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:45.026 * Looking for test storage... 
00:05:45.026 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:05:45.026 11:42:12 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:45.026 OK 00:05:45.026 11:42:12 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:45.026 00:05:45.026 real 0m0.140s 00:05:45.026 user 0m0.061s 00:05:45.026 sys 0m0.089s 00:05:45.026 11:42:12 rpc_client -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:45.026 11:42:12 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:05:45.026 ************************************ 00:05:45.026 END TEST rpc_client 00:05:45.026 ************************************ 00:05:45.285 11:42:12 -- spdk/autotest.sh@168 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:45.285 11:42:12 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:45.285 11:42:12 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:45.285 11:42:12 -- common/autotest_common.sh@10 -- # set +x 00:05:45.285 ************************************ 00:05:45.285 START TEST json_config 00:05:45.285 ************************************ 00:05:45.285 11:42:12 json_config -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:45.285 11:42:12 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@7 -- # uname -s 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@12 
-- # NVMF_IP_PREFIX=192.168.100 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:45.285 11:42:12 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:45.285 11:42:12 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:45.285 11:42:12 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:45.285 11:42:12 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:45.285 11:42:12 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.285 11:42:12 
json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.285 11:42:12 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.285 11:42:12 json_config -- paths/export.sh@5 -- # export PATH 00:05:45.285 11:42:12 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.286 11:42:12 json_config -- nvmf/common.sh@47 -- # : 0 00:05:45.286 11:42:12 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:45.286 11:42:12 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:45.286 11:42:12 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:45.286 11:42:12 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:45.286 11:42:12 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:45.286 11:42:12 json_config -- nvmf/common.sh@33 -- # 
'[' -n '' ']' 00:05:45.286 11:42:12 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:45.286 11:42:12 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@355 -- # 
trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:05:45.286 INFO: JSON configuration test init 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:45.286 11:42:12 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:05:45.286 11:42:12 json_config -- json_config/common.sh@9 -- # local app=target 00:05:45.286 11:42:12 json_config -- json_config/common.sh@10 -- # shift 00:05:45.286 11:42:12 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:45.286 11:42:12 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:45.286 11:42:12 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:45.286 11:42:12 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:45.286 11:42:12 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:45.286 11:42:12 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1626746 00:05:45.286 11:42:12 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:45.286 Waiting for target to run... 
00:05:45.286 11:42:12 json_config -- json_config/common.sh@25 -- # waitforlisten 1626746 /var/tmp/spdk_tgt.sock 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@827 -- # '[' -z 1626746 ']' 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:45.286 11:42:12 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:45.286 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:45.286 11:42:12 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:45.544 [2024-05-14 11:42:12.382737] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:05:45.544 [2024-05-14 11:42:12.382807] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626746 ] 00:05:45.802 [2024-05-14 11:42:12.771372] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.802 [2024-05-14 11:42:12.862539] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.368 11:42:13 json_config -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:46.368 11:42:13 json_config -- common/autotest_common.sh@860 -- # return 0 00:05:46.368 11:42:13 json_config -- json_config/common.sh@26 -- # echo '' 00:05:46.368 00:05:46.368 11:42:13 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:05:46.368 11:42:13 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:05:46.368 11:42:13 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:46.368 11:42:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:46.368 11:42:13 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:05:46.368 11:42:13 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:05:46.369 11:42:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:05:46.627 11:42:13 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:46.627 11:42:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:46.886 [2024-05-14 11:42:13.765272] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:46.886 11:42:13 
json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:46.886 11:42:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:47.144 [2024-05-14 11:42:13.993853] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:47.144 11:42:14 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:05:47.144 11:42:14 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:47.144 11:42:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:47.144 11:42:14 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:05:47.144 11:42:14 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:05:47.144 11:42:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:05:47.403 [2024-05-14 11:42:14.307367] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:49.971 11:42:16 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:05:49.971 11:42:16 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:05:49.971 11:42:16 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:49.971 11:42:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:49.971 11:42:16 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:05:49.971 11:42:16 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:05:49.971 11:42:16 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:05:49.971 11:42:16 json_config -- 
json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:05:49.971 11:42:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:05:49.971 11:42:16 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@48 -- # local get_types 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:05:50.230 11:42:17 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:50.230 11:42:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@55 -- # return 0 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:05:50.230 11:42:17 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:50.230 11:42:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@59 
-- # local ev_type ev_ctx event_id 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:05:50.230 11:42:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:50.230 11:42:17 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:50.488 11:42:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:05:50.488 11:42:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:50.488 11:42:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:50.488 11:42:17 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:05:50.488 11:42:17 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:05:50.488 11:42:17 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:05:50.488 11:42:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:05:50.748 Nvme0n1p0 Nvme0n1p1 00:05:50.748 11:42:17 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:05:50.748 11:42:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:05:51.068 [2024-05-14 11:42:17.920205] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:51.068 [2024-05-14 11:42:17.920258] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:51.068 
00:05:51.068 11:42:17 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:05:51.068 11:42:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:05:51.327 Malloc3 00:05:51.327 11:42:18 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:51.327 11:42:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:51.327 [2024-05-14 11:42:18.389542] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:51.327 [2024-05-14 11:42:18.389589] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:51.327 [2024-05-14 11:42:18.389610] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7b7aa0 00:05:51.327 [2024-05-14 11:42:18.389622] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:51.327 [2024-05-14 11:42:18.391196] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:51.327 [2024-05-14 11:42:18.391225] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:51.327 PTBdevFromMalloc3 00:05:51.327 11:42:18 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:05:51.327 11:42:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:05:51.585 Null0 00:05:51.585 11:42:18 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:05:51.585 11:42:18 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:05:51.844 Malloc0 00:05:51.844 11:42:18 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:05:51.844 11:42:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:05:52.102 Malloc1 00:05:52.102 11:42:19 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:05:52.102 11:42:19 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:05:52.360 102400+0 records in 00:05:52.360 102400+0 records out 00:05:52.360 104857600 bytes (105 MB, 100 MiB) copied, 0.3056 s, 343 MB/s 00:05:52.360 11:42:19 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:05:52.360 11:42:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:05:52.619 aio_disk 00:05:52.620 11:42:19 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:05:52.620 11:42:19 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:52.620 11:42:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:57.890 593836ca-002f-42ca-bdbd-2e88627add85 
00:05:57.890 11:42:24 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:05:57.890 11:42:24 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:05:57.890 11:42:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:05:57.890 11:42:24 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:05:57.890 11:42:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:05:57.890 11:42:24 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:57.890 11:42:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:58.149 11:42:25 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:58.149 11:42:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:58.408 11:42:25 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:05:58.408 11:42:25 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:58.408 11:42:25 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:58.666 MallocForCryptoBdev 00:05:58.666 11:42:25 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:05:58.666 11:42:25 json_config -- json_config/json_config.sh@159 -- # wc -l 00:05:58.666 11:42:25 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:05:58.666 11:42:25 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:05:58.666 11:42:25 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:58.666 11:42:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:58.925 [2024-05-14 11:42:25.836395] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:05:58.925 CryptoMallocBdev 00:05:58.925 11:42:25 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:05:58.925 11:42:25 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:05:58.925 11:42:25 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:0291459e-6f31-4f98-a0ff-5165644359b4 bdev_register:261a8437-5145-4f5d-bb2a-083df8a48da8 bdev_register:3db35ba6-e0b3-40bf-b838-7e67844ab197 bdev_register:d9f359da-532e-4079-baff-236a4dbda74a 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:58.925 11:42:25 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:05:58.925 11:42:25 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:05:58.925 11:42:25 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:05:58.925 11:42:25 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:0291459e-6f31-4f98-a0ff-5165644359b4 bdev_register:261a8437-5145-4f5d-bb2a-083df8a48da8 bdev_register:3db35ba6-e0b3-40bf-b838-7e67844ab197 bdev_register:d9f359da-532e-4079-baff-236a4dbda74a bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:58.925 11:42:25 json_config -- json_config/json_config.sh@71 -- # sort 00:05:58.925 11:42:25 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:05:58.926 11:42:25 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:05:58.926 11:42:25 json_config -- json_config/json_config.sh@72 -- # sort 00:05:58.926 11:42:25 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:05:58.926 11:42:25 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:58.926 11:42:25 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:58.926 11:42:25 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:05:58.926 11:42:25 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:58.926 11:42:25 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:0291459e-6f31-4f98-a0ff-5165644359b4 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:261a8437-5145-4f5d-bb2a-083df8a48da8 00:05:59.185 11:42:26 json_config -- 
json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:3db35ba6-e0b3-40bf-b838-7e67844ab197 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:d9f359da-532e-4079-baff-236a4dbda74a 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:0291459e-6f31-4f98-a0ff-5165644359b4 bdev_register:261a8437-5145-4f5d-bb2a-083df8a48da8 bdev_register:3db35ba6-e0b3-40bf-b838-7e67844ab197 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:d9f359da-532e-4079-baff-236a4dbda74a bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\2\9\1\4\5\9\e\-\6\f\3\1\-\4\f\9\8\-\a\0\f\f\-\5\1\6\5\6\4\4\3\5\9\b\4\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\2\6\1\a\8\4\3\7\-\5\1\4\5\-\4\f\5\d\-\b\b\2\a\-\0\8\3\d\f\8\a\4\8\d\a\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\3\d\b\3\5\b\a\6\-\e\0\b\3\-\4\0\b\f\-\b\8\3\8\-\7\e\6\7\8\4\4\a\b\1\9\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\d\9\f\3\5\9\d\a\-\5\3\2\e\-\4\0\7\9\-\b\a\f\f\-\2\3\6\a\4\d\b\d\a\7\4\a\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@86 -- # cat 00:05:59.185 11:42:26 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:0291459e-6f31-4f98-a0ff-5165644359b4 bdev_register:261a8437-5145-4f5d-bb2a-083df8a48da8 bdev_register:3db35ba6-e0b3-40bf-b838-7e67844ab197 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:d9f359da-532e-4079-baff-236a4dbda74a bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:05:59.185 Expected events matched: 00:05:59.185 bdev_register:0291459e-6f31-4f98-a0ff-5165644359b4 00:05:59.185 bdev_register:261a8437-5145-4f5d-bb2a-083df8a48da8 00:05:59.185 
bdev_register:3db35ba6-e0b3-40bf-b838-7e67844ab197 00:05:59.185 bdev_register:aio_disk 00:05:59.185 bdev_register:CryptoMallocBdev 00:05:59.185 bdev_register:d9f359da-532e-4079-baff-236a4dbda74a 00:05:59.185 bdev_register:Malloc0 00:05:59.185 bdev_register:Malloc0p0 00:05:59.185 bdev_register:Malloc0p1 00:05:59.185 bdev_register:Malloc0p2 00:05:59.185 bdev_register:Malloc1 00:05:59.185 bdev_register:Malloc3 00:05:59.185 bdev_register:MallocForCryptoBdev 00:05:59.185 bdev_register:Null0 00:05:59.185 bdev_register:Nvme0n1 00:05:59.185 bdev_register:Nvme0n1p0 00:05:59.185 bdev_register:Nvme0n1p1 00:05:59.185 bdev_register:PTBdevFromMalloc3 00:05:59.186 11:42:26 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:05:59.186 11:42:26 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:59.186 11:42:26 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:59.186 11:42:26 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:05:59.186 11:42:26 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:05:59.186 11:42:26 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:05:59.186 11:42:26 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:05:59.186 11:42:26 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:59.186 11:42:26 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:59.186 11:42:26 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:05:59.186 11:42:26 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:59.186 11:42:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:59.445 MallocBdevForConfigChangeCheck 00:05:59.445 11:42:26 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:05:59.445 11:42:26 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:59.445 11:42:26 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:59.445 11:42:26 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:05:59.445 11:42:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:00.012 11:42:26 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:00.012 INFO: shutting down applications... 00:06:00.012 11:42:26 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:00.012 11:42:26 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:00.012 11:42:26 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:00.012 11:42:26 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:00.012 [2024-05-14 11:42:27.012057] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:03.297 Calling clear_iscsi_subsystem 00:06:03.297 Calling clear_nvmf_subsystem 00:06:03.297 Calling clear_nbd_subsystem 00:06:03.297 Calling clear_ublk_subsystem 00:06:03.297 Calling clear_vhost_blk_subsystem 00:06:03.297 Calling clear_vhost_scsi_subsystem 00:06:03.297 Calling clear_bdev_subsystem 00:06:03.297 11:42:29 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:03.297 11:42:29 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:03.297 11:42:29 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:03.297 11:42:29 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:03.297 11:42:29 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:03.297 11:42:29 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:03.556 11:42:30 json_config -- json_config/json_config.sh@345 -- # break 00:06:03.556 11:42:30 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:03.556 11:42:30 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:03.556 11:42:30 json_config -- json_config/common.sh@31 -- # local app=target 00:06:03.556 11:42:30 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:03.556 11:42:30 json_config -- json_config/common.sh@35 -- # [[ -n 1626746 ]] 00:06:03.556 11:42:30 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1626746 00:06:03.556 11:42:30 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:03.556 11:42:30 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:03.556 11:42:30 json_config -- json_config/common.sh@41 -- # kill -0 1626746 00:06:03.556 11:42:30 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:03.816 11:42:30 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:03.816 11:42:30 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:03.816 11:42:30 json_config -- json_config/common.sh@41 -- # kill -0 1626746 00:06:03.816 11:42:30 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:03.816 11:42:30 json_config -- json_config/common.sh@43 -- # break 00:06:03.816 11:42:30 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:03.816 11:42:30 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:03.816 SPDK target 
shutdown done 00:06:03.816 11:42:30 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:03.816 INFO: relaunching applications... 00:06:03.816 11:42:30 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:03.816 11:42:30 json_config -- json_config/common.sh@9 -- # local app=target 00:06:03.816 11:42:30 json_config -- json_config/common.sh@10 -- # shift 00:06:03.816 11:42:30 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:03.816 11:42:30 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:03.816 11:42:30 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:03.816 11:42:30 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:03.816 11:42:30 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:03.816 11:42:30 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1629362 00:06:03.816 11:42:30 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:03.816 Waiting for target to run... 00:06:03.816 11:42:30 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:03.816 11:42:30 json_config -- json_config/common.sh@25 -- # waitforlisten 1629362 /var/tmp/spdk_tgt.sock 00:06:03.816 11:42:30 json_config -- common/autotest_common.sh@827 -- # '[' -z 1629362 ']' 00:06:03.816 11:42:30 json_config -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:03.817 11:42:30 json_config -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:03.817 11:42:30 json_config -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:06:03.817 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:03.817 11:42:30 json_config -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:03.817 11:42:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:04.074 [2024-05-14 11:42:30.966555] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:06:04.074 [2024-05-14 11:42:30.966625] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1629362 ] 00:06:04.641 [2024-05-14 11:42:31.565290] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:04.641 [2024-05-14 11:42:31.669747] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.641 [2024-05-14 11:42:31.715828] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:04.641 [2024-05-14 11:42:31.723866] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:04.900 [2024-05-14 11:42:31.731882] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:04.900 [2024-05-14 11:42:31.813131] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:07.435 [2024-05-14 11:42:34.018186] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:07.435 [2024-05-14 11:42:34.018246] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:07.435 [2024-05-14 11:42:34.018261] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:07.435 [2024-05-14 11:42:34.026203] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:07.435 
[2024-05-14 11:42:34.026229] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:07.435 [2024-05-14 11:42:34.034215] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:07.435 [2024-05-14 11:42:34.034239] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:07.435 [2024-05-14 11:42:34.042249] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:07.435 [2024-05-14 11:42:34.042276] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:07.435 [2024-05-14 11:42:34.042288] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:07.435 [2024-05-14 11:42:34.417299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:07.435 [2024-05-14 11:42:34.417347] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:07.435 [2024-05-14 11:42:34.417367] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x181cab0 00:06:07.435 [2024-05-14 11:42:34.417379] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:07.435 [2024-05-14 11:42:34.417670] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:07.435 [2024-05-14 11:42:34.417689] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:07.694 11:42:34 json_config -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:07.694 11:42:34 json_config -- common/autotest_common.sh@860 -- # return 0 00:06:07.694 11:42:34 json_config -- json_config/common.sh@26 -- # echo '' 00:06:07.694 00:06:07.694 11:42:34 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:07.694 11:42:34 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 
00:06:07.694 INFO: Checking if target configuration is the same... 00:06:07.694 11:42:34 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:07.694 11:42:34 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:07.694 11:42:34 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:07.694 + '[' 2 -ne 2 ']' 00:06:07.694 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:07.694 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:07.694 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:07.694 +++ basename /dev/fd/62 00:06:07.694 ++ mktemp /tmp/62.XXX 00:06:07.694 + tmp_file_1=/tmp/62.TBg 00:06:07.694 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:07.694 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:07.694 + tmp_file_2=/tmp/spdk_tgt_config.json.X85 00:06:07.694 + ret=0 00:06:07.694 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:07.953 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:07.953 + diff -u /tmp/62.TBg /tmp/spdk_tgt_config.json.X85 00:06:07.953 + echo 'INFO: JSON config files are the same' 00:06:07.953 INFO: JSON config files are the same 00:06:07.953 + rm /tmp/62.TBg /tmp/spdk_tgt_config.json.X85 00:06:07.953 + exit 0 00:06:07.953 11:42:34 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:06:07.953 11:42:34 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:06:07.953 INFO: changing configuration and checking if this can be detected... 
00:06:07.953 11:42:34 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:07.953 11:42:34 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:08.212 11:42:35 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:06:08.212 11:42:35 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:08.212 11:42:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:08.212 + '[' 2 -ne 2 ']' 00:06:08.212 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:08.212 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:06:08.212 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:08.212 +++ basename /dev/fd/62 00:06:08.212 ++ mktemp /tmp/62.XXX 00:06:08.212 + tmp_file_1=/tmp/62.T9z 00:06:08.212 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:08.212 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:08.212 + tmp_file_2=/tmp/spdk_tgt_config.json.Qih 00:06:08.212 + ret=0 00:06:08.212 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:08.781 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:08.781 + diff -u /tmp/62.T9z /tmp/spdk_tgt_config.json.Qih 00:06:08.781 + ret=1 00:06:08.781 + echo '=== Start of file: /tmp/62.T9z ===' 00:06:08.781 + cat /tmp/62.T9z 00:06:08.781 + echo '=== End of file: /tmp/62.T9z ===' 00:06:08.781 + echo '' 00:06:08.781 + echo '=== Start of file: /tmp/spdk_tgt_config.json.Qih ===' 00:06:08.781 + cat /tmp/spdk_tgt_config.json.Qih 00:06:08.781 + echo '=== End of file: /tmp/spdk_tgt_config.json.Qih ===' 00:06:08.781 + echo '' 00:06:08.781 + rm /tmp/62.T9z /tmp/spdk_tgt_config.json.Qih 00:06:08.781 + exit 1 00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:06:08.781 INFO: configuration change detected. 
00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:06:08.781 11:42:35 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:08.781 11:42:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@317 -- # [[ -n 1629362 ]] 00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:06:08.781 11:42:35 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:08.781 11:42:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:06:08.781 11:42:35 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:08.781 11:42:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:09.040 11:42:35 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:09.040 11:42:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:09.299 11:42:36 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:09.299 11:42:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:06:09.299 11:42:36 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:06:09.299 11:42:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:06:09.557 11:42:36 json_config -- json_config/json_config.sh@193 -- # uname -s 00:06:09.557 11:42:36 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:06:09.557 11:42:36 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:06:09.557 11:42:36 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:06:09.557 11:42:36 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:06:09.557 11:42:36 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:09.557 11:42:36 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:09.557 11:42:36 json_config -- json_config/json_config.sh@323 -- # killprocess 1629362 00:06:09.557 11:42:36 json_config -- common/autotest_common.sh@946 -- # '[' -z 1629362 ']' 00:06:09.557 11:42:36 json_config -- common/autotest_common.sh@950 -- # kill -0 1629362 00:06:09.557 11:42:36 json_config -- common/autotest_common.sh@951 -- # uname 00:06:09.557 11:42:36 json_config -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:09.557 11:42:36 json_config -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1629362 00:06:09.816 11:42:36 json_config -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:09.816 11:42:36 json_config -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:09.816 11:42:36 json_config -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1629362' 00:06:09.816 killing process with pid 1629362 00:06:09.816 11:42:36 json_config -- common/autotest_common.sh@965 -- # kill 1629362 00:06:09.816 11:42:36 json_config -- 
common/autotest_common.sh@970 -- # wait 1629362 00:06:13.172 11:42:39 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:13.172 11:42:39 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:06:13.172 11:42:39 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:13.172 11:42:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:13.172 11:42:39 json_config -- json_config/json_config.sh@328 -- # return 0 00:06:13.172 11:42:39 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:06:13.172 INFO: Success 00:06:13.172 00:06:13.172 real 0m27.761s 00:06:13.172 user 0m33.475s 00:06:13.172 sys 0m3.826s 00:06:13.172 11:42:39 json_config -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:13.172 11:42:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:13.172 ************************************ 00:06:13.172 END TEST json_config 00:06:13.172 ************************************ 00:06:13.172 11:42:39 -- spdk/autotest.sh@169 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:13.172 11:42:39 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:13.172 11:42:39 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:13.172 11:42:39 -- common/autotest_common.sh@10 -- # set +x 00:06:13.172 ************************************ 00:06:13.172 START TEST json_config_extra_key 00:06:13.172 ************************************ 00:06:13.172 11:42:40 json_config_extra_key -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:13.172 11:42:40 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e 
/bin/wpdk_common.sh ]] 00:06:13.172 11:42:40 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:13.172 11:42:40 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:13.172 11:42:40 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:13.172 11:42:40 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:13.172 11:42:40 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:13.172 11:42:40 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:13.172 11:42:40 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:13.172 11:42:40 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:13.172 INFO: launching applications... 00:06:13.172 11:42:40 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1630704 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:13.173 Waiting for target to run... 
00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1630704 /var/tmp/spdk_tgt.sock 00:06:13.173 11:42:40 json_config_extra_key -- common/autotest_common.sh@827 -- # '[' -z 1630704 ']' 00:06:13.173 11:42:40 json_config_extra_key -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:13.173 11:42:40 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:13.173 11:42:40 json_config_extra_key -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:13.173 11:42:40 json_config_extra_key -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:13.173 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:13.173 11:42:40 json_config_extra_key -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:13.173 11:42:40 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:13.173 [2024-05-14 11:42:40.235277] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:06:13.173 [2024-05-14 11:42:40.235353] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1630704 ] 00:06:14.109 [2024-05-14 11:42:40.834113] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.109 [2024-05-14 11:42:40.935132] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.109 11:42:41 json_config_extra_key -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:14.109 11:42:41 json_config_extra_key -- common/autotest_common.sh@860 -- # return 0 00:06:14.109 11:42:41 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:14.109 00:06:14.109 11:42:41 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:14.109 INFO: shutting down applications... 00:06:14.109 11:42:41 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:14.109 11:42:41 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:14.109 11:42:41 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:14.109 11:42:41 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1630704 ]] 00:06:14.109 11:42:41 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1630704 00:06:14.109 11:42:41 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:14.109 11:42:41 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:14.109 11:42:41 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1630704 00:06:14.109 11:42:41 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:14.677 11:42:41 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:14.677 11:42:41 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 
)) 00:06:14.677 11:42:41 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1630704 00:06:14.677 11:42:41 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:14.677 11:42:41 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:14.677 11:42:41 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:14.677 11:42:41 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:14.677 SPDK target shutdown done 00:06:14.677 11:42:41 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:14.677 Success 00:06:14.677 00:06:14.677 real 0m1.610s 00:06:14.677 user 0m1.079s 00:06:14.677 sys 0m0.716s 00:06:14.677 11:42:41 json_config_extra_key -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:14.677 11:42:41 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:14.677 ************************************ 00:06:14.677 END TEST json_config_extra_key 00:06:14.677 ************************************ 00:06:14.677 11:42:41 -- spdk/autotest.sh@170 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:14.677 11:42:41 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:14.677 11:42:41 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:14.677 11:42:41 -- common/autotest_common.sh@10 -- # set +x 00:06:14.677 ************************************ 00:06:14.677 START TEST alias_rpc 00:06:14.677 ************************************ 00:06:14.677 11:42:41 alias_rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:14.935 * Looking for test storage... 
00:06:14.935 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:06:14.935 11:42:41 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:14.935 11:42:41 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1630930 00:06:14.935 11:42:41 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1630930 00:06:14.935 11:42:41 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.935 11:42:41 alias_rpc -- common/autotest_common.sh@827 -- # '[' -z 1630930 ']' 00:06:14.935 11:42:41 alias_rpc -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.935 11:42:41 alias_rpc -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:14.935 11:42:41 alias_rpc -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.935 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.935 11:42:41 alias_rpc -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:14.935 11:42:41 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:14.935 [2024-05-14 11:42:41.931095] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:06:14.935 [2024-05-14 11:42:41.931169] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1630930 ] 00:06:15.194 [2024-05-14 11:42:42.063701] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.194 [2024-05-14 11:42:42.168470] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.128 11:42:42 alias_rpc -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:16.128 11:42:42 alias_rpc -- common/autotest_common.sh@860 -- # return 0 00:06:16.128 11:42:42 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:16.128 11:42:43 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1630930 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@946 -- # '[' -z 1630930 ']' 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@950 -- # kill -0 1630930 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@951 -- # uname 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1630930 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1630930' 00:06:16.128 killing process with pid 1630930 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@965 -- # kill 1630930 00:06:16.128 11:42:43 alias_rpc -- common/autotest_common.sh@970 -- # wait 1630930 00:06:16.696 00:06:16.696 real 0m1.785s 00:06:16.696 user 0m1.985s 00:06:16.696 sys 0m0.543s 00:06:16.696 11:42:43 alias_rpc -- common/autotest_common.sh@1122 -- # 
xtrace_disable 00:06:16.696 11:42:43 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:16.696 ************************************ 00:06:16.696 END TEST alias_rpc 00:06:16.696 ************************************ 00:06:16.696 11:42:43 -- spdk/autotest.sh@172 -- # [[ 0 -eq 0 ]] 00:06:16.696 11:42:43 -- spdk/autotest.sh@173 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:16.696 11:42:43 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:16.696 11:42:43 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:16.696 11:42:43 -- common/autotest_common.sh@10 -- # set +x 00:06:16.696 ************************************ 00:06:16.696 START TEST spdkcli_tcp 00:06:16.696 ************************************ 00:06:16.696 11:42:43 spdkcli_tcp -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:16.696 * Looking for test storage... 00:06:16.696 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:16.696 11:42:43 spdkcli_tcp -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:16.696 11:42:43 spdkcli_tcp -- 
common/autotest_common.sh@10 -- # set +x 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1631331 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1631331 00:06:16.696 11:42:43 spdkcli_tcp -- common/autotest_common.sh@827 -- # '[' -z 1631331 ']' 00:06:16.696 11:42:43 spdkcli_tcp -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.696 11:42:43 spdkcli_tcp -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:16.696 11:42:43 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:16.696 11:42:43 spdkcli_tcp -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:16.696 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:16.696 11:42:43 spdkcli_tcp -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:16.696 11:42:43 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:16.954 [2024-05-14 11:42:43.814951] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:06:16.954 [2024-05-14 11:42:43.815023] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1631331 ] 00:06:16.954 [2024-05-14 11:42:43.943283] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:17.213 [2024-05-14 11:42:44.048912] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:17.213 [2024-05-14 11:42:44.048917] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.780 11:42:44 spdkcli_tcp -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:17.780 11:42:44 spdkcli_tcp -- common/autotest_common.sh@860 -- # return 0 00:06:17.780 11:42:44 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1631402 00:06:17.780 11:42:44 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:17.780 11:42:44 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:18.039 [ 00:06:18.039 "bdev_malloc_delete", 00:06:18.039 "bdev_malloc_create", 00:06:18.039 "bdev_null_resize", 00:06:18.039 "bdev_null_delete", 00:06:18.039 "bdev_null_create", 00:06:18.039 "bdev_nvme_cuse_unregister", 00:06:18.039 "bdev_nvme_cuse_register", 00:06:18.040 "bdev_opal_new_user", 00:06:18.040 "bdev_opal_set_lock_state", 00:06:18.040 "bdev_opal_delete", 00:06:18.040 "bdev_opal_get_info", 00:06:18.040 "bdev_opal_create", 00:06:18.040 "bdev_nvme_opal_revert", 00:06:18.040 "bdev_nvme_opal_init", 00:06:18.040 "bdev_nvme_send_cmd", 00:06:18.040 "bdev_nvme_get_path_iostat", 00:06:18.040 "bdev_nvme_get_mdns_discovery_info", 00:06:18.040 "bdev_nvme_stop_mdns_discovery", 00:06:18.040 "bdev_nvme_start_mdns_discovery", 00:06:18.040 "bdev_nvme_set_multipath_policy", 00:06:18.040 "bdev_nvme_set_preferred_path", 00:06:18.040 "bdev_nvme_get_io_paths", 
00:06:18.040 "bdev_nvme_remove_error_injection", 00:06:18.040 "bdev_nvme_add_error_injection", 00:06:18.040 "bdev_nvme_get_discovery_info", 00:06:18.040 "bdev_nvme_stop_discovery", 00:06:18.040 "bdev_nvme_start_discovery", 00:06:18.040 "bdev_nvme_get_controller_health_info", 00:06:18.040 "bdev_nvme_disable_controller", 00:06:18.040 "bdev_nvme_enable_controller", 00:06:18.040 "bdev_nvme_reset_controller", 00:06:18.040 "bdev_nvme_get_transport_statistics", 00:06:18.040 "bdev_nvme_apply_firmware", 00:06:18.040 "bdev_nvme_detach_controller", 00:06:18.040 "bdev_nvme_get_controllers", 00:06:18.040 "bdev_nvme_attach_controller", 00:06:18.040 "bdev_nvme_set_hotplug", 00:06:18.040 "bdev_nvme_set_options", 00:06:18.040 "bdev_passthru_delete", 00:06:18.040 "bdev_passthru_create", 00:06:18.040 "bdev_lvol_check_shallow_copy", 00:06:18.040 "bdev_lvol_start_shallow_copy", 00:06:18.040 "bdev_lvol_grow_lvstore", 00:06:18.040 "bdev_lvol_get_lvols", 00:06:18.040 "bdev_lvol_get_lvstores", 00:06:18.040 "bdev_lvol_delete", 00:06:18.040 "bdev_lvol_set_read_only", 00:06:18.040 "bdev_lvol_resize", 00:06:18.040 "bdev_lvol_decouple_parent", 00:06:18.040 "bdev_lvol_inflate", 00:06:18.040 "bdev_lvol_rename", 00:06:18.040 "bdev_lvol_clone_bdev", 00:06:18.040 "bdev_lvol_clone", 00:06:18.040 "bdev_lvol_snapshot", 00:06:18.040 "bdev_lvol_create", 00:06:18.040 "bdev_lvol_delete_lvstore", 00:06:18.040 "bdev_lvol_rename_lvstore", 00:06:18.040 "bdev_lvol_create_lvstore", 00:06:18.040 "bdev_raid_set_options", 00:06:18.040 "bdev_raid_remove_base_bdev", 00:06:18.040 "bdev_raid_add_base_bdev", 00:06:18.040 "bdev_raid_delete", 00:06:18.040 "bdev_raid_create", 00:06:18.040 "bdev_raid_get_bdevs", 00:06:18.040 "bdev_error_inject_error", 00:06:18.040 "bdev_error_delete", 00:06:18.040 "bdev_error_create", 00:06:18.040 "bdev_split_delete", 00:06:18.040 "bdev_split_create", 00:06:18.040 "bdev_delay_delete", 00:06:18.040 "bdev_delay_create", 00:06:18.040 "bdev_delay_update_latency", 00:06:18.040 
"bdev_zone_block_delete", 00:06:18.040 "bdev_zone_block_create", 00:06:18.040 "blobfs_create", 00:06:18.040 "blobfs_detect", 00:06:18.040 "blobfs_set_cache_size", 00:06:18.040 "bdev_crypto_delete", 00:06:18.040 "bdev_crypto_create", 00:06:18.040 "bdev_compress_delete", 00:06:18.040 "bdev_compress_create", 00:06:18.040 "bdev_compress_get_orphans", 00:06:18.040 "bdev_aio_delete", 00:06:18.040 "bdev_aio_rescan", 00:06:18.040 "bdev_aio_create", 00:06:18.040 "bdev_ftl_set_property", 00:06:18.040 "bdev_ftl_get_properties", 00:06:18.040 "bdev_ftl_get_stats", 00:06:18.040 "bdev_ftl_unmap", 00:06:18.040 "bdev_ftl_unload", 00:06:18.040 "bdev_ftl_delete", 00:06:18.040 "bdev_ftl_load", 00:06:18.040 "bdev_ftl_create", 00:06:18.040 "bdev_virtio_attach_controller", 00:06:18.040 "bdev_virtio_scsi_get_devices", 00:06:18.040 "bdev_virtio_detach_controller", 00:06:18.040 "bdev_virtio_blk_set_hotplug", 00:06:18.040 "bdev_iscsi_delete", 00:06:18.040 "bdev_iscsi_create", 00:06:18.040 "bdev_iscsi_set_options", 00:06:18.040 "accel_error_inject_error", 00:06:18.040 "ioat_scan_accel_module", 00:06:18.040 "dsa_scan_accel_module", 00:06:18.040 "iaa_scan_accel_module", 00:06:18.040 "dpdk_cryptodev_get_driver", 00:06:18.040 "dpdk_cryptodev_set_driver", 00:06:18.040 "dpdk_cryptodev_scan_accel_module", 00:06:18.040 "compressdev_scan_accel_module", 00:06:18.040 "keyring_file_remove_key", 00:06:18.040 "keyring_file_add_key", 00:06:18.040 "iscsi_get_histogram", 00:06:18.040 "iscsi_enable_histogram", 00:06:18.040 "iscsi_set_options", 00:06:18.040 "iscsi_get_auth_groups", 00:06:18.040 "iscsi_auth_group_remove_secret", 00:06:18.040 "iscsi_auth_group_add_secret", 00:06:18.040 "iscsi_delete_auth_group", 00:06:18.040 "iscsi_create_auth_group", 00:06:18.040 "iscsi_set_discovery_auth", 00:06:18.040 "iscsi_get_options", 00:06:18.040 "iscsi_target_node_request_logout", 00:06:18.040 "iscsi_target_node_set_redirect", 00:06:18.040 "iscsi_target_node_set_auth", 00:06:18.040 "iscsi_target_node_add_lun", 
00:06:18.040 "iscsi_get_stats", 00:06:18.040 "iscsi_get_connections", 00:06:18.040 "iscsi_portal_group_set_auth", 00:06:18.040 "iscsi_start_portal_group", 00:06:18.040 "iscsi_delete_portal_group", 00:06:18.040 "iscsi_create_portal_group", 00:06:18.040 "iscsi_get_portal_groups", 00:06:18.040 "iscsi_delete_target_node", 00:06:18.040 "iscsi_target_node_remove_pg_ig_maps", 00:06:18.040 "iscsi_target_node_add_pg_ig_maps", 00:06:18.040 "iscsi_create_target_node", 00:06:18.040 "iscsi_get_target_nodes", 00:06:18.040 "iscsi_delete_initiator_group", 00:06:18.040 "iscsi_initiator_group_remove_initiators", 00:06:18.040 "iscsi_initiator_group_add_initiators", 00:06:18.040 "iscsi_create_initiator_group", 00:06:18.040 "iscsi_get_initiator_groups", 00:06:18.040 "nvmf_set_crdt", 00:06:18.040 "nvmf_set_config", 00:06:18.040 "nvmf_set_max_subsystems", 00:06:18.040 "nvmf_subsystem_get_listeners", 00:06:18.040 "nvmf_subsystem_get_qpairs", 00:06:18.040 "nvmf_subsystem_get_controllers", 00:06:18.040 "nvmf_get_stats", 00:06:18.040 "nvmf_get_transports", 00:06:18.040 "nvmf_create_transport", 00:06:18.040 "nvmf_get_targets", 00:06:18.040 "nvmf_delete_target", 00:06:18.040 "nvmf_create_target", 00:06:18.040 "nvmf_subsystem_allow_any_host", 00:06:18.040 "nvmf_subsystem_remove_host", 00:06:18.040 "nvmf_subsystem_add_host", 00:06:18.040 "nvmf_ns_remove_host", 00:06:18.040 "nvmf_ns_add_host", 00:06:18.040 "nvmf_subsystem_remove_ns", 00:06:18.040 "nvmf_subsystem_add_ns", 00:06:18.040 "nvmf_subsystem_listener_set_ana_state", 00:06:18.040 "nvmf_discovery_get_referrals", 00:06:18.040 "nvmf_discovery_remove_referral", 00:06:18.040 "nvmf_discovery_add_referral", 00:06:18.040 "nvmf_subsystem_remove_listener", 00:06:18.040 "nvmf_subsystem_add_listener", 00:06:18.040 "nvmf_delete_subsystem", 00:06:18.040 "nvmf_create_subsystem", 00:06:18.040 "nvmf_get_subsystems", 00:06:18.040 "env_dpdk_get_mem_stats", 00:06:18.040 "nbd_get_disks", 00:06:18.040 "nbd_stop_disk", 00:06:18.040 "nbd_start_disk", 00:06:18.040 
"ublk_recover_disk", 00:06:18.040 "ublk_get_disks", 00:06:18.040 "ublk_stop_disk", 00:06:18.040 "ublk_start_disk", 00:06:18.040 "ublk_destroy_target", 00:06:18.040 "ublk_create_target", 00:06:18.040 "virtio_blk_create_transport", 00:06:18.040 "virtio_blk_get_transports", 00:06:18.040 "vhost_controller_set_coalescing", 00:06:18.040 "vhost_get_controllers", 00:06:18.040 "vhost_delete_controller", 00:06:18.040 "vhost_create_blk_controller", 00:06:18.040 "vhost_scsi_controller_remove_target", 00:06:18.040 "vhost_scsi_controller_add_target", 00:06:18.040 "vhost_start_scsi_controller", 00:06:18.040 "vhost_create_scsi_controller", 00:06:18.040 "thread_set_cpumask", 00:06:18.040 "framework_get_scheduler", 00:06:18.040 "framework_set_scheduler", 00:06:18.040 "framework_get_reactors", 00:06:18.040 "thread_get_io_channels", 00:06:18.040 "thread_get_pollers", 00:06:18.040 "thread_get_stats", 00:06:18.040 "framework_monitor_context_switch", 00:06:18.040 "spdk_kill_instance", 00:06:18.040 "log_enable_timestamps", 00:06:18.040 "log_get_flags", 00:06:18.040 "log_clear_flag", 00:06:18.040 "log_set_flag", 00:06:18.040 "log_get_level", 00:06:18.040 "log_set_level", 00:06:18.040 "log_get_print_level", 00:06:18.040 "log_set_print_level", 00:06:18.040 "framework_enable_cpumask_locks", 00:06:18.040 "framework_disable_cpumask_locks", 00:06:18.040 "framework_wait_init", 00:06:18.040 "framework_start_init", 00:06:18.040 "scsi_get_devices", 00:06:18.040 "bdev_get_histogram", 00:06:18.040 "bdev_enable_histogram", 00:06:18.040 "bdev_set_qos_limit", 00:06:18.040 "bdev_set_qd_sampling_period", 00:06:18.040 "bdev_get_bdevs", 00:06:18.040 "bdev_reset_iostat", 00:06:18.040 "bdev_get_iostat", 00:06:18.040 "bdev_examine", 00:06:18.040 "bdev_wait_for_examine", 00:06:18.040 "bdev_set_options", 00:06:18.040 "notify_get_notifications", 00:06:18.040 "notify_get_types", 00:06:18.040 "accel_get_stats", 00:06:18.040 "accel_set_options", 00:06:18.040 "accel_set_driver", 00:06:18.040 
"accel_crypto_key_destroy", 00:06:18.040 "accel_crypto_keys_get", 00:06:18.040 "accel_crypto_key_create", 00:06:18.040 "accel_assign_opc", 00:06:18.040 "accel_get_module_info", 00:06:18.040 "accel_get_opc_assignments", 00:06:18.040 "vmd_rescan", 00:06:18.040 "vmd_remove_device", 00:06:18.040 "vmd_enable", 00:06:18.040 "sock_get_default_impl", 00:06:18.040 "sock_set_default_impl", 00:06:18.040 "sock_impl_set_options", 00:06:18.040 "sock_impl_get_options", 00:06:18.040 "iobuf_get_stats", 00:06:18.040 "iobuf_set_options", 00:06:18.040 "framework_get_pci_devices", 00:06:18.040 "framework_get_config", 00:06:18.040 "framework_get_subsystems", 00:06:18.040 "trace_get_info", 00:06:18.040 "trace_get_tpoint_group_mask", 00:06:18.040 "trace_disable_tpoint_group", 00:06:18.040 "trace_enable_tpoint_group", 00:06:18.040 "trace_clear_tpoint_mask", 00:06:18.040 "trace_set_tpoint_mask", 00:06:18.041 "keyring_get_keys", 00:06:18.041 "spdk_get_version", 00:06:18.041 "rpc_get_methods" 00:06:18.041 ] 00:06:18.041 11:42:44 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:18.041 11:42:44 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:18.041 11:42:44 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:18.041 11:42:45 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:18.041 11:42:45 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1631331 00:06:18.041 11:42:45 spdkcli_tcp -- common/autotest_common.sh@946 -- # '[' -z 1631331 ']' 00:06:18.041 11:42:45 spdkcli_tcp -- common/autotest_common.sh@950 -- # kill -0 1631331 00:06:18.041 11:42:45 spdkcli_tcp -- common/autotest_common.sh@951 -- # uname 00:06:18.041 11:42:45 spdkcli_tcp -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:18.041 11:42:45 spdkcli_tcp -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1631331 00:06:18.041 11:42:45 spdkcli_tcp -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:18.041 11:42:45 
spdkcli_tcp -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:18.041 11:42:45 spdkcli_tcp -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1631331' 00:06:18.041 killing process with pid 1631331 00:06:18.041 11:42:45 spdkcli_tcp -- common/autotest_common.sh@965 -- # kill 1631331 00:06:18.041 11:42:45 spdkcli_tcp -- common/autotest_common.sh@970 -- # wait 1631331 00:06:18.609 00:06:18.609 real 0m1.831s 00:06:18.609 user 0m3.308s 00:06:18.609 sys 0m0.609s 00:06:18.609 11:42:45 spdkcli_tcp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:18.609 11:42:45 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:18.609 ************************************ 00:06:18.609 END TEST spdkcli_tcp 00:06:18.609 ************************************ 00:06:18.609 11:42:45 -- spdk/autotest.sh@176 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:18.609 11:42:45 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:18.609 11:42:45 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:18.609 11:42:45 -- common/autotest_common.sh@10 -- # set +x 00:06:18.609 ************************************ 00:06:18.609 START TEST dpdk_mem_utility 00:06:18.609 ************************************ 00:06:18.609 11:42:45 dpdk_mem_utility -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:18.609 * Looking for test storage... 
00:06:18.609 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:06:18.609 11:42:45 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:18.609 11:42:45 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1631596 00:06:18.609 11:42:45 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1631596 00:06:18.609 11:42:45 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:18.609 11:42:45 dpdk_mem_utility -- common/autotest_common.sh@827 -- # '[' -z 1631596 ']' 00:06:18.609 11:42:45 dpdk_mem_utility -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:18.609 11:42:45 dpdk_mem_utility -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:18.609 11:42:45 dpdk_mem_utility -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:18.609 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:18.609 11:42:45 dpdk_mem_utility -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:18.609 11:42:45 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:18.868 [2024-05-14 11:42:45.728352] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:06:18.868 [2024-05-14 11:42:45.728433] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1631596 ] 00:06:18.868 [2024-05-14 11:42:45.858698] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.128 [2024-05-14 11:42:45.956793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.695 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:19.695 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@860 -- # return 0 00:06:19.695 11:42:46 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:19.695 11:42:46 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:19.695 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:19.695 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:19.695 { 00:06:19.695 "filename": "/tmp/spdk_mem_dump.txt" 00:06:19.695 } 00:06:19.695 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:19.695 11:42:46 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:19.695 DPDK memory size 816.000000 MiB in 2 heap(s) 00:06:19.695 2 heaps totaling size 816.000000 MiB 00:06:19.695 size: 814.000000 MiB heap id: 0 00:06:19.695 size: 2.000000 MiB heap id: 1 00:06:19.695 end heaps---------- 00:06:19.695 8 mempools totaling size 598.116089 MiB 00:06:19.695 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:19.695 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:19.695 size: 84.521057 MiB name: bdev_io_1631596 00:06:19.695 size: 51.011292 MiB name: evtpool_1631596 00:06:19.695 size: 50.003479 MiB name: 
msgpool_1631596
00:06:19.695 size: 21.763794 MiB name: PDU_Pool
00:06:19.695 size: 19.513306 MiB name: SCSI_TASK_Pool
00:06:19.695 size: 0.026123 MiB name: Session_Pool
00:06:19.695 end mempools-------
00:06:19.695 201 memzones totaling size 4.176453 MiB
00:06:19.695 size: 1.000366 MiB name: RG_ring_0_1631596
00:06:19.695 size: 1.000366 MiB name: RG_ring_1_1631596
00:06:19.695 size: 1.000366 MiB name: RG_ring_4_1631596
00:06:19.695 size: 1.000366 MiB name: RG_ring_5_1631596
00:06:19.695 size: 0.125366 MiB name: RG_ring_2_1631596
00:06:19.695 size: 0.015991 MiB name: RG_ring_3_1631596
00:06:19.695 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1
00:06:19.695 size: 0.000305 MiB name: 0000:3d:01.0_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:01.1_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:01.2_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:01.3_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:01.4_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:01.5_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:01.6_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:01.7_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:02.0_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:02.1_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:02.2_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:02.3_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:02.4_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:02.5_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:02.6_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3d:02.7_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3f:01.0_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3f:01.1_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3f:01.2_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3f:01.3_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3f:01.4_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3f:01.5_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3f:01.6_qat
00:06:19.695 size: 0.000305 MiB name: 0000:3f:01.7_qat
00:06:19.695 size: 0.000305 MiB
name: 0000:3f:02.0_qat 00:06:19.695 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:06:19.695 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:06:19.695 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:06:19.695 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:06:19.695 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:06:19.695 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:06:19.695 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:06:19.695 size: 0.000305 MiB name: 0000:da:01.0_qat 00:06:19.695 size: 0.000305 MiB name: 0000:da:01.1_qat 00:06:19.695 size: 0.000305 MiB name: 0000:da:01.2_qat 00:06:19.695 size: 0.000305 MiB name: 0000:da:01.3_qat 00:06:19.695 size: 0.000305 MiB name: 0000:da:01.4_qat 00:06:19.695 size: 0.000305 MiB name: 0000:da:01.5_qat 00:06:19.695 size: 0.000305 MiB name: 0000:da:01.6_qat 00:06:19.695 size: 0.000305 MiB name: 0000:da:01.7_qat 00:06:19.696 size: 0.000305 MiB name: 0000:da:02.0_qat 00:06:19.696 size: 0.000305 MiB name: 0000:da:02.1_qat 00:06:19.696 size: 0.000305 MiB name: 0000:da:02.2_qat 00:06:19.696 size: 0.000305 MiB name: 0000:da:02.3_qat 00:06:19.696 size: 0.000305 MiB name: 0000:da:02.4_qat 00:06:19.696 size: 0.000305 MiB name: 0000:da:02.5_qat 00:06:19.696 size: 0.000305 MiB name: 0000:da:02.6_qat 00:06:19.696 size: 0.000305 MiB name: 0000:da:02.7_qat 00:06:19.696 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:19.696 size: 0.000122 MiB name: 
rte_cryptodev_data_7 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_3 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:19.696 size: 0.000122 MiB name: 
rte_cryptodev_data_29 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_14 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_20 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:19.696 size: 0.000122 MiB 
name: rte_cryptodev_data_51 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_25 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:19.696 size: 0.000122 
MiB name: rte_cryptodev_data_73 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_36 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:19.696 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:19.696 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:19.697 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:19.697 size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:19.697 size: 
0.000122 MiB name: rte_cryptodev_data_95
00:06:19.697 size: 0.000122 MiB name: rte_compressdev_data_47
00:06:19.697 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:06:19.697 end memzones-------
00:06:19.697 11:42:46 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0
00:06:19.959 heap id: 0 total size: 814.000000 MiB number of busy elements: 546 number of free elements: 14
00:06:19.959 list of free elements. size: 11.809875 MiB
00:06:19.959 element at address: 0x200000400000 with size: 1.999512 MiB
00:06:19.959 element at address: 0x200018e00000 with size: 0.999878 MiB
00:06:19.959 element at address: 0x200019000000 with size: 0.999878 MiB
00:06:19.959 element at address: 0x200003e00000 with size: 0.996460 MiB
00:06:19.959 element at address: 0x200031c00000 with size: 0.994446 MiB
00:06:19.959 element at address: 0x200013800000 with size: 0.978882 MiB
00:06:19.959 element at address: 0x200007000000 with size: 0.959839 MiB
00:06:19.959 element at address: 0x200019200000 with size: 0.937256 MiB
00:06:19.959 element at address: 0x20001aa00000 with size: 0.580505 MiB
00:06:19.959 element at address: 0x200003a00000 with size: 0.498535 MiB
00:06:19.959 element at address: 0x20000b200000 with size: 0.491272 MiB
00:06:19.959 element at address: 0x200000800000 with size: 0.486328 MiB
00:06:19.959 element at address: 0x200019400000 with size: 0.485840 MiB
00:06:19.959 element at address: 0x200027e00000 with size: 0.401245 MiB
00:06:19.959 list of standard malloc elements.
size: 199.881836 MiB 00:06:19.959 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:19.959 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:19.959 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:19.959 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:19.959 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:19.959 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:19.959 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:19.959 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:19.959 element at address: 0x200000330b40 with size: 0.004395 MiB 00:06:19.959 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:06:19.959 element at address: 0x200000337640 with size: 0.004395 MiB 00:06:19.959 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:06:19.959 element at address: 0x20000033e140 with size: 0.004395 MiB 00:06:19.959 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:06:19.959 element at address: 0x200000344c40 with size: 0.004395 MiB 00:06:19.959 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:06:19.959 element at address: 0x20000034b740 with size: 0.004395 MiB 00:06:19.959 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:06:19.959 element at address: 0x200000352240 with size: 0.004395 MiB 00:06:19.959 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:06:19.959 element at address: 0x200000358d40 with size: 0.004395 MiB 00:06:19.959 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:06:19.959 element at address: 0x20000035f840 with size: 0.004395 MiB 00:06:19.959 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:06:19.959 element at address: 0x200000366880 with size: 0.004395 MiB 00:06:19.959 element at address: 0x20000036a340 with size: 0.004395 MiB 00:06:19.959 element at address: 0x20000036de00 with size: 0.004395 MiB 00:06:19.959 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:06:19.959 element at address: 0x200000375380 with size: 0.004395 MiB 00:06:19.959 element at address: 0x200000378e40 with size: 0.004395 MiB 00:06:19.960 element at address: 0x20000037c900 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:06:19.960 element at address: 0x200000383e80 with size: 0.004395 MiB 00:06:19.960 element at address: 0x200000387940 with size: 0.004395 MiB 00:06:19.960 element at address: 0x20000038b400 with size: 0.004395 MiB 00:06:19.960 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:06:19.960 element at address: 0x200000392980 with size: 0.004395 MiB 00:06:19.960 element at address: 0x200000396440 with size: 0.004395 MiB 00:06:19.960 element at address: 0x200000399f00 with size: 0.004395 MiB 00:06:19.960 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:06:19.960 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:06:19.960 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:19.960 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000333040 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000335540 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000339b40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000033c040 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000340640 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000342b40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000347140 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000349640 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000350140 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000354740 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000356c40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:06:19.960 element at address: 0x20000035b240 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000035d740 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000361d40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000364780 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000365800 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000368240 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000370840 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000373280 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000374300 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000376d40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000037a800 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000037b880 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000037f340 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000381d80 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000382e00 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000385840 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000389300 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000038a380 with size: 0.004028 MiB 00:06:19.960 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000038de40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000390880 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000391900 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000394340 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000397e00 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000398e80 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000039c940 with size: 0.004028 MiB 00:06:19.960 element at address: 0x20000039f380 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:06:19.960 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:19.960 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:19.960 element at address: 0x200000204ec0 with size: 0.000305 MiB 00:06:19.960 element at address: 0x200000200000 with size: 0.000183 MiB 00:06:19.960 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200180 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200240 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200300 with size: 0.000183 MiB 00:06:19.960 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200480 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200540 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200600 with size: 0.000183 MiB 00:06:19.960 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200780 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200840 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200900 with size: 0.000183 
MiB 00:06:19.960 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200a80 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200b40 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200c00 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200d80 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200e40 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200f00 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201080 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201140 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201200 with size: 0.000183 MiB 00:06:19.960 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201380 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201440 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201500 with size: 0.000183 MiB 00:06:19.960 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201680 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201740 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201800 with size: 0.000183 MiB 00:06:19.960 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201980 with size: 0.000183 MiB 00:06:19.960 element at address: 0x200000201a40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000201b00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000201c80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000201d40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000201e00 
with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000201f80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202040 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202100 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202280 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202340 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202400 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202580 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202640 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202700 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202880 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202940 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202a00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202b80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202c40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202d00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202e80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000202f40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203000 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203180 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203240 with size: 0.000183 MiB 00:06:19.961 element at 
address: 0x200000203300 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203480 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203540 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203600 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203780 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203840 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203900 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203a80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203b40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203c00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203d80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203e40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203f00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204080 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204140 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204200 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204380 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204440 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204500 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002045c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204680 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204740 with size: 0.000183 MiB 
00:06:19.961 element at address: 0x200000204800 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204980 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204a40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204b00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204c80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204d40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000204e00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205000 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205180 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205240 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205300 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205480 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205540 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205600 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205780 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205840 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205900 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205a80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205b40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205c00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205d80 with 
size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205e40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205f00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000206080 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000206140 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000206200 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000020a780 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022af80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b040 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b100 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b280 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b340 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b400 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b580 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b640 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b700 with size: 0.000183 MiB 00:06:19.961 element at address: 
0x20000022b900 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022be40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022c080 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022c140 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022c200 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022c380 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022c440 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000022c500 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000032e700 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000331d40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000338840 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000033f340 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000345e40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000034c940 with size: 0.000183 MiB 00:06:19.961 
element at address: 0x20000034fec0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000353440 with size: 0.000183 MiB 00:06:19.961 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000359f40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000360a40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000364180 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000364240 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000364400 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000367a80 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000367c40 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000367d00 with size: 0.000183 MiB 00:06:19.961 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:06:19.961 element at address: 0x20000036b540 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000036b700 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000036b980 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000036f000 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000036f280 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000036f440 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000372c80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000372d40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000372f00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000376580 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000376740 with size: 0.000183 
MiB 00:06:19.962 element at address: 0x200000376800 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000037a040 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000037a200 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000037a480 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000037db00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000037df40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000381780 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000381840 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000381a00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000385080 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000385240 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000385300 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000388b40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000388d00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000388f80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000038c600 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000038c880 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000390280 
with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000390340 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000390500 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000393b80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000393d40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000393e00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000397640 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000397800 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200000397a80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000039b100 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000039b380 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000039b540 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000039f000 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:06:19.962 element at 
address: 0x2000003a9dc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003c3740 with size: 0.000183 MiB 
00:06:19.962 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087c800 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087c980 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087cb00 with 
size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:19.962 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:19.962 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200027e66b80 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200027e66c40 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200027e6d840 with size: 0.000183 MiB 00:06:19.962 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:06:19.963 element at address: 
0x200027e6dbc0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:06:19.963 
element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:19.963 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:19.963 list of memzone associated elements. 
size: 602.308289 MiB 00:06:19.963 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:19.963 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:19.963 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:19.963 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:19.963 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:19.963 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1631596_0 00:06:19.963 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:19.963 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1631596_0 00:06:19.963 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:19.963 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1631596_0 00:06:19.963 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:19.963 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:19.963 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:19.963 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:19.963 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:19.963 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1631596 00:06:19.963 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:19.963 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1631596 00:06:19.963 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:06:19.963 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1631596 00:06:19.963 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:19.963 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:19.963 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:19.963 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:19.963 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:19.963 
associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:19.963 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:19.963 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:19.963 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:19.963 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1631596 00:06:19.963 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:19.963 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1631596 00:06:19.963 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:19.963 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1631596 00:06:19.963 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:19.963 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1631596 00:06:19.963 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:06:19.963 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1631596 00:06:19.963 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:06:19.963 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:19.963 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:19.963 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:19.963 element at address: 0x20001947c600 with size: 0.250488 MiB 00:06:19.963 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:19.963 element at address: 0x20000020a840 with size: 0.125488 MiB 00:06:19.963 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1631596 00:06:19.963 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:19.963 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:19.963 element at address: 0x200027e66d00 with size: 0.023743 MiB 00:06:19.963 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:19.963 element at address: 0x200000206580 with size: 0.016113 
MiB 00:06:19.963 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1631596 00:06:19.963 element at address: 0x200027e6ce40 with size: 0.002441 MiB 00:06:19.963 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:19.963 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:06:19.963 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:19.963 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:06:19.963 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:06:19.963 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:06:19.963 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:06:19.963 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:06:19.963 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:06:19.963 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:06:19.963 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:06:19.963 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:06:19.963 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:06:19.963 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:06:19.963 
associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:06:19.963 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:06:19.963 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:06:19.963 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:06:19.963 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:06:19.963 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:06:19.963 element at address: 0x20000039b700 with size: 0.000427 MiB 00:06:19.963 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:06:19.963 element at address: 0x200000397c40 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat 00:06:19.964 element at address: 0x200000394180 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:06:19.964 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:06:19.964 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:06:19.964 element at address: 0x200000389140 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:06:19.964 element at address: 0x200000385680 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:06:19.964 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:06:19.964 associated memzone 
info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:06:19.964 element at address: 0x20000037e100 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:06:19.964 element at address: 0x20000037a640 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:06:19.964 element at address: 0x200000376b80 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:06:19.964 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:06:19.964 element at address: 0x20000036f600 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:06:19.964 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:06:19.964 element at address: 0x200000368080 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:06:19.964 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:06:19.964 element at address: 0x200000360b00 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:06:19.964 element at address: 0x20000035d580 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:06:19.964 element at address: 0x20000035a000 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:06:19.964 element at address: 0x200000356a80 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:06:19.964 element at address: 0x200000353500 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 
MiB name: 0000:da:01.4_qat 00:06:19.964 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:06:19.964 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:06:19.964 element at address: 0x200000349480 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:06:19.964 element at address: 0x200000345f00 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:06:19.964 element at address: 0x200000342980 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:06:19.964 element at address: 0x20000033f400 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:06:19.964 element at address: 0x20000033be80 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:06:19.964 element at address: 0x200000338900 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:06:19.964 element at address: 0x200000335380 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:06:19.964 element at address: 0x200000331e00 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:06:19.964 element at address: 0x20000032e880 with size: 0.000427 MiB 00:06:19.964 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:06:19.964 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:06:19.964 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:19.964 element at address: 0x20000022b7c0 with size: 0.000305 MiB 00:06:19.964 associated memzone info: size: 0.000183 MiB name: 
MP_msgpool_1631596
00:06:19.964 element at address: 0x200000206380 with size: 0.000305 MiB
00:06:19.964 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1631596
00:06:19.964 element at address: 0x200027e6d900 with size: 0.000305 MiB
00:06:19.964 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:06:19.964 element at address: 0x2000003d6940 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0
00:06:19.964 element at address: 0x2000003d6640 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1
00:06:19.964 element at address: 0x2000003d5e80 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0
00:06:19.964 element at address: 0x2000003d2740 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2
00:06:19.964 element at address: 0x2000003d2580 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3
00:06:19.964 element at address: 0x2000003d2300 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1
00:06:19.964 element at address: 0x2000003cec80 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4
00:06:19.964 element at address: 0x2000003ceac0 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5
00:06:19.964 element at address: 0x2000003ce840 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2
00:06:19.964 element at address: 0x2000003cb1c0 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6
00:06:19.964 element at address: 0x2000003cb000 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7
00:06:19.964 element at address: 0x2000003cad80 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3
00:06:19.964 element at address: 0x2000003c7700 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8
00:06:19.964 element at address: 0x2000003c7540 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9
00:06:19.964 element at address: 0x2000003c72c0 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4
00:06:19.964 element at address: 0x2000003c3c40 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10
00:06:19.964 element at address: 0x2000003c3a80 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11
00:06:19.964 element at address: 0x2000003c3800 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5
00:06:19.964 element at address: 0x2000003c0180 with size: 0.000244 MiB
00:06:19.964 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12
00:06:19.964 element at address: 0x2000003bffc0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13
00:06:19.965 element at address: 0x2000003bfd40 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6
00:06:19.965 element at address: 0x2000003bc6c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14
00:06:19.965 element at address: 0x2000003bc500 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15
00:06:19.965 element at address: 0x2000003bc280 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7
00:06:19.965 element at address: 0x2000003b8c00 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16
00:06:19.965 element at address: 0x2000003b8a40 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17
00:06:19.965 element at address: 0x2000003b87c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8
00:06:19.965 element at address: 0x2000003b5140 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18
00:06:19.965 element at address: 0x2000003b4f80 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19
00:06:19.965 element at address: 0x2000003b4d00 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9
00:06:19.965 element at address: 0x2000003b1680 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20
00:06:19.965 element at address: 0x2000003b14c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21
00:06:19.965 element at address: 0x2000003b1240 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10
00:06:19.965 element at address: 0x2000003adbc0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22
00:06:19.965 element at address: 0x2000003ada00 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23
00:06:19.965 element at address: 0x2000003ad780 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11
00:06:19.965 element at address: 0x2000003aa100 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24
00:06:19.965 element at address: 0x2000003a9f40 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25
00:06:19.965 element at address: 0x2000003a9cc0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12
00:06:19.965 element at address: 0x2000003a6640 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26
00:06:19.965 element at address: 0x2000003a6480 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27
00:06:19.965 element at address: 0x2000003a6200 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13
00:06:19.965 element at address: 0x2000003a2b80 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28
00:06:19.965 element at address: 0x2000003a29c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29
00:06:19.965 element at address: 0x2000003a2740 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14
00:06:19.965 element at address: 0x20000039f0c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30
00:06:19.965 element at address: 0x20000039ef00 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31
00:06:19.965 element at address: 0x20000039ec80 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15
00:06:19.965 element at address: 0x20000039b600 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32
00:06:19.965 element at address: 0x20000039b440 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33
00:06:19.965 element at address: 0x20000039b1c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16
00:06:19.965 element at address: 0x200000397b40 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34
00:06:19.965 element at address: 0x200000397980 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35
00:06:19.965 element at address: 0x200000397700 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17
00:06:19.965 element at address: 0x200000394080 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36
00:06:19.965 element at address: 0x200000393ec0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37
00:06:19.965 element at address: 0x200000393c40 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18
00:06:19.965 element at address: 0x2000003905c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38
00:06:19.965 element at address: 0x200000390400 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39
00:06:19.965 element at address: 0x200000390180 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19
00:06:19.965 element at address: 0x20000038cb00 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40
00:06:19.965 element at address: 0x20000038c940 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41
00:06:19.965 element at address: 0x20000038c6c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20
00:06:19.965 element at address: 0x200000389040 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42
00:06:19.965 element at address: 0x200000388e80 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43
00:06:19.965 element at address: 0x200000388c00 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21
00:06:19.965 element at address: 0x200000385580 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44
00:06:19.965 element at address: 0x2000003853c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45
00:06:19.965 element at address: 0x200000385140 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22
00:06:19.965 element at address: 0x200000381ac0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46
00:06:19.965 element at address: 0x200000381900 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47
00:06:19.965 element at address: 0x200000381680 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23
00:06:19.965 element at address: 0x20000037e000 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48
00:06:19.965 element at address: 0x20000037de40 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49
00:06:19.965 element at address: 0x20000037dbc0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24
00:06:19.965 element at address: 0x20000037a540 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50
00:06:19.965 element at address: 0x20000037a380 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51
00:06:19.965 element at address: 0x20000037a100 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25
00:06:19.965 element at address: 0x200000376a80 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52
00:06:19.965 element at address: 0x2000003768c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53
00:06:19.965 element at address: 0x200000376640 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26
00:06:19.965 element at address: 0x200000372fc0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54
00:06:19.965 element at address: 0x200000372e00 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55
00:06:19.965 element at address: 0x200000372b80 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27
00:06:19.965 element at address: 0x20000036f500 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56
00:06:19.965 element at address: 0x20000036f340 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57
00:06:19.965 element at address: 0x20000036f0c0 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28
00:06:19.965 element at address: 0x20000036ba40 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58
00:06:19.965 element at address: 0x20000036b880 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59
00:06:19.965 element at address: 0x20000036b600 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29
00:06:19.965 element at address: 0x200000367f80 with size: 0.000244 MiB
00:06:19.965 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60
00:06:19.966 element at address: 0x200000367dc0 with size: 0.000244 MiB
00:06:19.966 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61
00:06:19.966 element at address: 0x200000367b40 with size: 0.000244 MiB
00:06:19.966 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30
00:06:19.966 element at address: 0x2000003644c0 with size: 0.000244 MiB
00:06:19.966 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62
00:06:19.966 element at address: 0x200000364300 with size: 0.000244 MiB
00:06:19.966 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63
00:06:19.966 element at address: 0x200000364080 with size: 0.000244 MiB
00:06:19.966 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31
00:06:19.966 element at address: 0x2000003d5d00 with size: 0.000183 MiB
00:06:19.966 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:06:19.966 11:42:46 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:06:19.966 11:42:46 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1631596
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@946 -- # '[' -z 1631596 ']'
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@950 -- # kill -0 1631596
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@951 -- # uname
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']'
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1631596
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@952 -- # process_name=reactor_0
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']'
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1631596' killing process with pid 1631596
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@965 -- # kill 1631596
00:06:19.966 11:42:46 dpdk_mem_utility -- common/autotest_common.sh@970 -- # wait 1631596
00:06:20.225
00:06:20.225 real 0m1.755s
00:06:20.225 user 0m1.925s
00:06:20.225 sys 0m0.559s
00:06:20.225 11:42:47 dpdk_mem_utility -- common/autotest_common.sh@1122 -- # xtrace_disable
00:06:20.225 11:42:47 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:06:20.225 ************************************
00:06:20.225 END TEST dpdk_mem_utility
00:06:20.225 ************************************
00:06:20.483 11:42:47 -- spdk/autotest.sh@177 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:06:20.483 11:42:47 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']'
00:06:20.483 11:42:47 -- common/autotest_common.sh@1103 -- # xtrace_disable
00:06:20.483 11:42:47 -- common/autotest_common.sh@10 -- # set +x
00:06:20.483 ************************************
00:06:20.483 START TEST event
00:06:20.483 ************************************
00:06:20.483 11:42:47 event -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:06:20.483 * Looking for test storage...
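The xtrace above shows autotest's `killprocess` helper: probe the pid with `kill -0`, read its command name with `ps`, refuse to touch a `sudo` wrapper, then kill and reap the target. A minimal sketch of that pattern (function body and variable names here are illustrative, not the exact `autotest_common.sh` source):

```shell
# Hedged re-creation of the killprocess pattern traced in the log above.
killprocess() {
    local pid=$1
    [ -z "$pid" ] && return 1                 # the '[' -z ... ']' guard
    kill -0 "$pid" 2>/dev/null || return 0    # kill -0: does the pid exist?
    local process_name
    process_name=$(ps --no-headers -o comm= "$pid")
    [ "$process_name" = sudo ] && return 1    # never kill the sudo wrapper itself
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" 2>/dev/null || true           # reap the child so the pid is gone
}

sleep 60 &
demo_pid=$!
killprocess "$demo_pid"
```

`wait` only works because the target is a child of the same shell; the real helper relies on the same fact, since the test app was launched by the script.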
00:06:20.483 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event
00:06:20.483 11:42:47 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:06:20.483 11:42:47 event -- bdev/nbd_common.sh@6 -- # set -e
00:06:20.483 11:42:47 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:06:20.483 11:42:47 event -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']'
00:06:20.483 11:42:47 event -- common/autotest_common.sh@1103 -- # xtrace_disable
00:06:20.484 11:42:47 event -- common/autotest_common.sh@10 -- # set +x
00:06:20.484 ************************************
00:06:20.484 START TEST event_perf
00:06:20.484 ************************************
00:06:20.484 11:42:47 event.event_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:06:20.742 Running I/O for 1 seconds...[2024-05-14 11:42:47.576076] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:06:20.742 [2024-05-14 11:42:47.576139] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1631943 ]
00:06:20.742 [2024-05-14 11:42:47.704201] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:06:20.742 [2024-05-14 11:42:47.810240] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:20.742 [2024-05-14 11:42:47.810342] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:06:20.742 [2024-05-14 11:42:47.810444] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:06:20.742 [2024-05-14 11:42:47.810446] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:22.119 Running I/O for 1 seconds...
00:06:22.119 lcore 0: 163222
00:06:22.119 lcore 1: 163221
00:06:22.119 lcore 2: 163222
00:06:22.119 lcore 3: 163222
00:06:22.119 done.
00:06:22.119
00:06:22.119 real 0m1.359s
00:06:22.119 user 0m4.210s
00:06:22.119 sys 0m0.141s
00:06:22.119 11:42:48 event.event_perf -- common/autotest_common.sh@1122 -- # xtrace_disable
00:06:22.119 11:42:48 event.event_perf -- common/autotest_common.sh@10 -- # set +x
00:06:22.119 ************************************
00:06:22.119 END TEST event_perf
00:06:22.119 ************************************
00:06:22.119 11:42:48 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:06:22.119 11:42:48 event -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']'
00:06:22.119 11:42:48 event -- common/autotest_common.sh@1103 -- # xtrace_disable
00:06:22.119 11:42:48 event -- common/autotest_common.sh@10 -- # set +x
00:06:22.119 ************************************
00:06:22.119 START TEST event_reactor
00:06:22.119 ************************************
00:06:22.119 11:42:48 event.event_reactor -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:06:22.119 [2024-05-14 11:42:49.010895] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
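The `-m 0xF` coremask passed to `event_perf` above selects lcores by bit position, which is why the log reports four reactors on cores 0-3 and four per-lcore event counts. A small sketch of how such a mask expands into a core list (variable names are illustrative, not SPDK/DPDK source):

```shell
# Expand a hex coremask into the list of selected lcores, one per set bit.
mask=$((0xF))
cores=""
bit=0
while [ "$mask" -gt 0 ]; do
    if [ $((mask & 1)) -eq 1 ]; then
        cores="$cores $bit"        # bit N set -> lcore N is selected
    fi
    mask=$((mask >> 1))
    bit=$((bit + 1))
done
echo "lcores:$cores"
```

With mask `0xF` (binary 1111) this prints `lcores: 0 1 2 3`, matching the four reactors in the run above; `-c 0x1`, used by the single-core reactor tests below, would select only lcore 0.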
00:06:22.119 [2024-05-14 11:42:49.010953] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1632178 ]
00:06:22.119 [2024-05-14 11:42:49.138361] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:22.377 [2024-05-14 11:42:49.241226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:23.312 test_start
00:06:23.312 oneshot
00:06:23.312 tick 100
00:06:23.312 tick 100
00:06:23.312 tick 250
00:06:23.312 tick 100
00:06:23.312 tick 100
00:06:23.312 tick 100
00:06:23.312 tick 250
00:06:23.312 tick 500
00:06:23.312 tick 100
00:06:23.312 tick 100
00:06:23.312 tick 250
00:06:23.312 tick 100
00:06:23.312 tick 100
00:06:23.312 test_end
00:06:23.312
00:06:23.312 real 0m1.351s
00:06:23.312 user 0m1.219s
00:06:23.312 sys 0m0.126s
00:06:23.312 11:42:50 event.event_reactor -- common/autotest_common.sh@1122 -- # xtrace_disable
00:06:23.312 11:42:50 event.event_reactor -- common/autotest_common.sh@10 -- # set +x
00:06:23.312 ************************************
00:06:23.312 END TEST event_reactor
00:06:23.312 ************************************
00:06:23.312 11:42:50 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:06:23.312 11:42:50 event -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']'
00:06:23.312 11:42:50 event -- common/autotest_common.sh@1103 -- # xtrace_disable
00:06:23.312 11:42:50 event -- common/autotest_common.sh@10 -- # set +x
00:06:23.571 ************************************
00:06:23.571 START TEST event_reactor_perf
00:06:23.571 ************************************
00:06:23.571 11:42:50 event.event_reactor_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:06:23.571 [2024-05-14 11:42:50.453492] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:06:23.571 [2024-05-14 11:42:50.453552] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1632397 ]
00:06:23.571 [2024-05-14 11:42:50.582192] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:23.830 [2024-05-14 11:42:50.684463] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:24.766 test_start
00:06:24.766 test_end
00:06:24.766 Performance: 324305 events per second
00:06:24.766
00:06:24.766 real 0m1.352s
00:06:24.766 user 0m1.202s
00:06:24.766 sys 0m0.142s
00:06:24.766 11:42:51 event.event_reactor_perf -- common/autotest_common.sh@1122 -- # xtrace_disable
00:06:24.766 11:42:51 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x
00:06:24.766 ************************************
00:06:24.766 END TEST event_reactor_perf
00:06:24.766 ************************************
00:06:24.766 11:42:51 event -- event/event.sh@49 -- # uname -s
00:06:24.766 11:42:51 event -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:06:24.766 11:42:51 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:06:24.766 11:42:51 event -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']'
00:06:24.766 11:42:51 event -- common/autotest_common.sh@1103 -- # xtrace_disable
00:06:24.766 11:42:51 event -- common/autotest_common.sh@10 -- # set +x
00:06:25.025 ************************************
00:06:25.025 START TEST event_scheduler
00:06:25.025 ************************************
00:06:25.025 11:42:51 event.event_scheduler -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:06:25.025 * Looking for test storage...
00:06:25.025 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler
00:06:25.025 11:42:51 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:06:25.025 11:42:51 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1632615
00:06:25.025 11:42:51 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:06:25.025 11:42:51 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1632615
00:06:25.025 11:42:51 event.event_scheduler -- common/autotest_common.sh@827 -- # '[' -z 1632615 ']'
00:06:25.025 11:42:51 event.event_scheduler -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:25.025 11:42:51 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:06:25.025 11:42:51 event.event_scheduler -- common/autotest_common.sh@832 -- # local max_retries=100
00:06:25.025 11:42:51 event.event_scheduler -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:25.025 11:42:51 event.event_scheduler -- common/autotest_common.sh@836 -- # xtrace_disable
00:06:25.025 11:42:51 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:06:25.025 [2024-05-14 11:42:52.046020] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:06:25.025 [2024-05-14 11:42:52.046089] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1632615 ]
00:06:25.284 [2024-05-14 11:42:52.175644] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:06:25.284 [2024-05-14 11:42:52.280825] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:25.284 [2024-05-14 11:42:52.280854] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:25.284 [2024-05-14 11:42:52.280955] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:06:25.284 [2024-05-14 11:42:52.280957] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:06:26.218 11:42:52 event.event_scheduler -- common/autotest_common.sh@856 -- # (( i == 0 ))
00:06:26.218 11:42:52 event.event_scheduler -- common/autotest_common.sh@860 -- # return 0
00:06:26.218 11:42:52 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:06:26.218 11:42:52 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.218 11:42:52 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:06:26.218 POWER: Env isn't set yet!
00:06:26.218 POWER: Attempting to initialise ACPI cpufreq power management...
00:06:26.218 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor
00:06:26.218 POWER: Cannot set governor of lcore 0 to userspace
00:06:26.218 POWER: Attempting to initialise PSTAT power management...
00:06:26.218 POWER: Power management governor of lcore 0 has been set to 'performance' successfully
00:06:26.218 POWER: Initialized successfully for lcore 0 power management
00:06:26.218 POWER: Power management governor of lcore 1 has been set to 'performance' successfully
00:06:26.218 POWER: Initialized successfully for lcore 1 power management
00:06:26.218 POWER: Power management governor of lcore 2 has been set to 'performance' successfully
00:06:26.218 POWER: Initialized successfully for lcore 2 power management
00:06:26.218 POWER: Power management governor of lcore 3 has been set to 'performance' successfully
00:06:26.218 POWER: Initialized successfully for lcore 3 power management
00:06:26.218 11:42:53 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.218 11:42:53 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:06:26.218 11:42:53 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.218 11:42:53 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:06:26.218 [2024-05-14 11:42:53.147353] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
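The POWER: lines above record the DPDK power library switching each lcore's cpufreq governor to 'performance' through sysfs. A hedged sketch of that per-core switch is below; `DRY_RUN`, `set_governor`, and the fixed core list 0-3 are assumptions for illustration only, and real writes go to sysfs and need root:

```shell
# Sketch of a per-core governor switch like the one the POWER: lines describe.
# DRY_RUN=1 prints the writes instead of performing them (no root, no sysfs).
DRY_RUN=1
set_governor() {
    gov=$1; shift
    for cpu in "$@"; do
        path=/sys/devices/system/cpu/cpu$cpu/cpufreq/scaling_governor
        if [ "$DRY_RUN" = 1 ]; then
            echo "would write '$gov' to $path"   # dry run: show the intent only
        else
            echo "$gov" > "$path"                # real run: requires root
        fi
    done
}
out=$(set_governor performance 0 1 2 3)
echo "$out"
```

The "Failed to write ... scaling_governor" line earlier in the log is what such a write looks like when the requested governor (here `userspace`) is not available, after which the library falls back to another driver (PSTAT) as shown.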
00:06:26.218 11:42:53 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.218 11:42:53 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:06:26.218 11:42:53 event.event_scheduler -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']'
00:06:26.218 11:42:53 event.event_scheduler -- common/autotest_common.sh@1103 -- # xtrace_disable
00:06:26.218 11:42:53 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:06:26.218 ************************************
00:06:26.218 START TEST scheduler_create_thread
00:06:26.218 ************************************
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1121 -- # scheduler_create_thread
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.218 2
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.218 3
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.218 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.219 4
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.219 5
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.219 6
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.219 7
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.219 8
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.219 9
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.219 10
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:06:26.219 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.478 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:26.776 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:26.776 11:42:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:06:26.776 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:26.776 11:42:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:28.676 11:42:55 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:28.676 11:42:55 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12
00:06:28.676 11:42:55 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12
00:06:28.676 11:42:55 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:28.676 11:42:55 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:29.241 11:42:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:29.241
00:06:29.241 real 0m3.104s
00:06:29.241 user 0m0.023s
00:06:29.241 sys 0m0.008s
00:06:29.241 11:42:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1122 -- # xtrace_disable
00:06:29.241 11:42:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:29.241 ************************************
00:06:29.241 END TEST scheduler_create_thread
00:06:29.241 ************************************
00:06:29.499 11:42:56 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT
00:06:29.499 11:42:56 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1632615
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@946 -- # '[' -z 1632615 ']'
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@950 -- # kill -0 1632615
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@951 -- # uname
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']'
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1632615
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@952 -- # process_name=reactor_2
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@956 -- # '[' reactor_2 = sudo ']'
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1632615' killing process with pid 1632615
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@965 -- # kill 1632615
00:06:29.499 11:42:56 event.event_scheduler -- common/autotest_common.sh@970 -- # wait 1632615
00:06:29.757 [2024-05-14
11:42:56.683262] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:06:29.757 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:29.757 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:29.757 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:29.757 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:29.757 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:29.757 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:29.757 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:29.757 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:30.015 00:06:30.015 real 0m5.048s 00:06:30.015 user 0m9.810s 00:06:30.015 sys 0m0.528s 00:06:30.015 11:42:56 event.event_scheduler -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:30.015 11:42:56 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:30.015 ************************************ 00:06:30.015 END TEST event_scheduler 00:06:30.015 ************************************ 00:06:30.015 11:42:56 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:30.015 11:42:56 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:30.015 11:42:56 event -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:30.015 11:42:56 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:30.015 11:42:56 event -- common/autotest_common.sh@10 -- # set +x 00:06:30.015 ************************************ 00:06:30.015 START TEST app_repeat 00:06:30.015 ************************************ 00:06:30.015 11:42:57 event.app_repeat -- common/autotest_common.sh@1121 -- # app_repeat_test 
00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1633332 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1633332' 00:06:30.015 Process app_repeat pid: 1633332 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:30.015 spdk_app_start Round 0 00:06:30.015 11:42:57 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1633332 /var/tmp/spdk-nbd.sock 00:06:30.015 11:42:57 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 1633332 ']' 00:06:30.015 11:42:57 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:30.015 11:42:57 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:30.015 11:42:57 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:06:30.015 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:30.015 11:42:57 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:30.015 11:42:57 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:30.015 [2024-05-14 11:42:57.075495] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:06:30.015 [2024-05-14 11:42:57.075567] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1633332 ] 00:06:30.274 [2024-05-14 11:42:57.208607] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:30.274 [2024-05-14 11:42:57.316422] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:30.274 [2024-05-14 11:42:57.316429] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.209 11:42:57 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:31.209 11:42:57 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:06:31.209 11:42:57 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:31.209 Malloc0 00:06:31.209 11:42:58 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:31.466 Malloc1 00:06:31.466 11:42:58 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # 
local bdev_list 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.466 11:42:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:31.724 /dev/nbd0 00:06:31.724 11:42:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:31.724 11:42:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:06:31.724 11:42:58 event.app_repeat -- 
common/autotest_common.sh@869 -- # break 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:31.724 1+0 records in 00:06:31.724 1+0 records out 00:06:31.724 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253237 s, 16.2 MB/s 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:31.724 11:42:58 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:06:31.724 11:42:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:31.724 11:42:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.724 11:42:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:31.982 /dev/nbd1 00:06:31.982 11:42:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:31.982 11:42:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:31.982 11:42:58 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:06:31.982 11:42:58 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:06:31.982 11:42:58 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:31.982 11:42:58 event.app_repeat -- 
common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:31.982 11:42:58 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:06:31.982 11:42:59 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:06:31.982 11:42:59 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:31.982 11:42:59 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:31.982 11:42:59 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:31.982 1+0 records in 00:06:31.982 1+0 records out 00:06:31.982 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252225 s, 16.2 MB/s 00:06:31.982 11:42:59 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:31.982 11:42:59 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:06:31.982 11:42:59 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:31.982 11:42:59 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:31.982 11:42:59 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:06:31.982 11:42:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:31.982 11:42:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.982 11:42:59 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:31.982 11:42:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.982 11:42:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:32.240 { 00:06:32.240 "nbd_device": 
"/dev/nbd0", 00:06:32.240 "bdev_name": "Malloc0" 00:06:32.240 }, 00:06:32.240 { 00:06:32.240 "nbd_device": "/dev/nbd1", 00:06:32.240 "bdev_name": "Malloc1" 00:06:32.240 } 00:06:32.240 ]' 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:32.240 { 00:06:32.240 "nbd_device": "/dev/nbd0", 00:06:32.240 "bdev_name": "Malloc0" 00:06:32.240 }, 00:06:32.240 { 00:06:32.240 "nbd_device": "/dev/nbd1", 00:06:32.240 "bdev_name": "Malloc1" 00:06:32.240 } 00:06:32.240 ]' 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:32.240 /dev/nbd1' 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:32.240 /dev/nbd1' 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:32.240 256+0 records in 00:06:32.240 256+0 records out 00:06:32.240 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108544 s, 96.6 MB/s 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:32.240 256+0 records in 00:06:32.240 256+0 records out 00:06:32.240 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.030077 s, 34.9 MB/s 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:32.240 11:42:59 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:32.498 256+0 records in 00:06:32.498 256+0 records out 00:06:32.498 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0313566 s, 33.4 MB/s 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:32.498 11:42:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:32.756 11:42:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:32.756 11:42:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:32.756 11:42:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:32.756 11:42:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:32.756 11:42:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:32.756 11:42:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:32.756 11:42:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:32.756 11:42:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:32.756 11:42:59 event.app_repeat -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:32.756 11:42:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:33.014 11:42:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:33.272 11:43:00 
event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:33.272 11:43:00 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:33.273 11:43:00 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:33.530 11:43:00 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:33.788 [2024-05-14 11:43:00.620604] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:33.788 [2024-05-14 11:43:00.719600] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:33.788 [2024-05-14 11:43:00.719604] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.788 [2024-05-14 11:43:00.772982] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:33.788 [2024-05-14 11:43:00.773034] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:36.319 11:43:03 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:36.319 11:43:03 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:36.319 spdk_app_start Round 1 00:06:36.319 11:43:03 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1633332 /var/tmp/spdk-nbd.sock 00:06:36.319 11:43:03 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 1633332 ']' 00:06:36.319 11:43:03 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:36.319 11:43:03 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:36.319 11:43:03 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:06:36.319 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:36.319 11:43:03 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:36.319 11:43:03 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:36.577 11:43:03 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:36.577 11:43:03 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:06:36.577 11:43:03 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:36.835 Malloc0 00:06:36.835 11:43:03 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:37.094 Malloc1 00:06:37.094 11:43:04 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:37.094 11:43:04 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:37.352 /dev/nbd0 00:06:37.352 11:43:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:37.352 11:43:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:37.352 1+0 records in 00:06:37.352 1+0 records out 00:06:37.352 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238453 s, 17.2 MB/s 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:37.352 11:43:04 event.app_repeat 
-- common/autotest_common.sh@882 -- # size=4096 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:37.352 11:43:04 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:06:37.352 11:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:37.352 11:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:37.352 11:43:04 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:37.610 /dev/nbd1 00:06:37.610 11:43:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:37.610 11:43:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:37.610 1+0 records in 00:06:37.610 1+0 records out 00:06:37.610 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274618 s, 14.9 MB/s 00:06:37.610 
11:43:04 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:06:37.610 11:43:04 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:37.611 11:43:04 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:37.611 11:43:04 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:06:37.611 11:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:37.611 11:43:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:37.611 11:43:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:37.611 11:43:04 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:37.611 11:43:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:37.869 11:43:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:37.869 { 00:06:37.869 "nbd_device": "/dev/nbd0", 00:06:37.870 "bdev_name": "Malloc0" 00:06:37.870 }, 00:06:37.870 { 00:06:37.870 "nbd_device": "/dev/nbd1", 00:06:37.870 "bdev_name": "Malloc1" 00:06:37.870 } 00:06:37.870 ]' 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:37.870 { 00:06:37.870 "nbd_device": "/dev/nbd0", 00:06:37.870 "bdev_name": "Malloc0" 00:06:37.870 }, 00:06:37.870 { 00:06:37.870 "nbd_device": "/dev/nbd1", 00:06:37.870 "bdev_name": "Malloc1" 00:06:37.870 } 00:06:37.870 ]' 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:37.870 /dev/nbd1' 00:06:37.870 11:43:04 event.app_repeat -- 
bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:37.870 /dev/nbd1' 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:37.870 256+0 records in 00:06:37.870 256+0 records out 00:06:37.870 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109822 s, 95.5 MB/s 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:37.870 11:43:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:38.129 256+0 records in 00:06:38.129 256+0 records out 00:06:38.129 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0288778 s, 36.3 MB/s 00:06:38.129 11:43:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:38.129 11:43:04 event.app_repeat -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:38.129 256+0 records in 00:06:38.129 256+0 records out 00:06:38.129 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0311351 s, 33.7 MB/s 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@50 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:38.129 11:43:05 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:38.388 11:43:05 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:38.646 11:43:05 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:38.646 11:43:05 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:38.646 11:43:05 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:38.646 11:43:05 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:38.646 11:43:05 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:38.646 11:43:05 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:38.646 11:43:05 
event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:38.647 11:43:05 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:38.647 11:43:05 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:38.647 11:43:05 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:38.647 11:43:05 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:38.905 11:43:05 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:38.905 11:43:05 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:39.164 11:43:06 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:39.423 [2024-05-14 11:43:06.372441] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:39.423 [2024-05-14 11:43:06.471130] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:39.423 [2024-05-14 
11:43:06.471134] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.681 [2024-05-14 11:43:06.525345] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:39.681 [2024-05-14 11:43:06.525396] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:42.210 11:43:09 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:42.210 11:43:09 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:42.210 spdk_app_start Round 2 00:06:42.210 11:43:09 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1633332 /var/tmp/spdk-nbd.sock 00:06:42.210 11:43:09 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 1633332 ']' 00:06:42.210 11:43:09 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:42.210 11:43:09 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:42.210 11:43:09 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:42.210 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:42.210 11:43:09 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:42.210 11:43:09 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:42.468 11:43:09 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:42.468 11:43:09 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:06:42.468 11:43:09 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:42.726 Malloc0 00:06:42.726 11:43:09 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:42.985 Malloc1 00:06:42.985 11:43:09 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:42.985 11:43:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:43.269 /dev/nbd0 00:06:43.269 11:43:10 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:43.269 11:43:10 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:43.269 1+0 records in 00:06:43.269 1+0 records out 00:06:43.269 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234356 s, 17.5 MB/s 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:06:43.269 11:43:10 event.app_repeat -- 
common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:43.269 11:43:10 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:06:43.269 11:43:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:43.269 11:43:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:43.269 11:43:10 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:43.565 /dev/nbd1 00:06:43.565 11:43:10 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:43.565 11:43:10 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:43.565 1+0 records in 00:06:43.565 1+0 records out 00:06:43.565 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000302487 s, 13.5 MB/s 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:43.565 11:43:10 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:06:43.565 11:43:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:43.565 11:43:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:43.565 11:43:10 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:43.565 11:43:10 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:43.565 11:43:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:43.823 11:43:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:43.823 { 00:06:43.823 "nbd_device": "/dev/nbd0", 00:06:43.823 "bdev_name": "Malloc0" 00:06:43.823 }, 00:06:43.823 { 00:06:43.823 "nbd_device": "/dev/nbd1", 00:06:43.823 "bdev_name": "Malloc1" 00:06:43.823 } 00:06:43.823 ]' 00:06:43.823 11:43:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:43.823 { 00:06:43.823 "nbd_device": "/dev/nbd0", 00:06:43.823 "bdev_name": "Malloc0" 00:06:43.823 }, 00:06:43.823 { 00:06:43.823 "nbd_device": "/dev/nbd1", 00:06:43.823 "bdev_name": "Malloc1" 00:06:43.823 } 00:06:43.823 ]' 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:43.824 /dev/nbd1' 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:43.824 /dev/nbd1' 00:06:43.824 
11:43:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:43.824 256+0 records in 00:06:43.824 256+0 records out 00:06:43.824 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109098 s, 96.1 MB/s 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:43.824 256+0 records in 00:06:43.824 256+0 records out 00:06:43.824 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0301971 s, 34.7 MB/s 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:43.824 256+0 records in 00:06:43.824 256+0 records out 00:06:43.824 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0313467 s, 33.5 MB/s 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:43.824 11:43:10 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:44.083 11:43:11 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:44.341 11:43:11 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:44.341 11:43:11 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:44.341 11:43:11 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:44.341 11:43:11 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:44.341 11:43:11 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:44.342 11:43:11 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:44.342 11:43:11 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:06:44.342 11:43:11 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:44.342 11:43:11 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:44.342 11:43:11 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:44.342 11:43:11 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:44.600 11:43:11 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:44.600 11:43:11 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:44.858 11:43:11 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:45.116 [2024-05-14 11:43:12.134365] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:45.374 [2024-05-14 11:43:12.234006] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:45.374 [2024-05-14 11:43:12.234011] reactor.c: 937:reactor_run: 
*NOTICE*: Reactor started on core 0 00:06:45.374 [2024-05-14 11:43:12.286868] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:45.374 [2024-05-14 11:43:12.286922] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:47.903 11:43:14 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1633332 /var/tmp/spdk-nbd.sock 00:06:47.903 11:43:14 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 1633332 ']' 00:06:47.903 11:43:14 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:47.903 11:43:14 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:47.903 11:43:14 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:47.903 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:47.903 11:43:14 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:47.903 11:43:14 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:06:48.162 11:43:15 event.app_repeat -- event/event.sh@39 -- # killprocess 1633332 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@946 -- # '[' -z 1633332 ']' 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@950 -- # kill -0 1633332 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@951 -- # uname 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1633332 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1633332' 00:06:48.162 killing process with pid 1633332 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@965 -- # kill 1633332 00:06:48.162 11:43:15 event.app_repeat -- common/autotest_common.sh@970 -- # wait 1633332 00:06:48.421 spdk_app_start is called in Round 0. 00:06:48.421 Shutdown signal received, stop current app iteration 00:06:48.421 Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 reinitialization... 00:06:48.421 spdk_app_start is called in Round 1. 00:06:48.421 Shutdown signal received, stop current app iteration 00:06:48.421 Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 reinitialization... 00:06:48.421 spdk_app_start is called in Round 2. 
00:06:48.421 Shutdown signal received, stop current app iteration 00:06:48.421 Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 reinitialization... 00:06:48.421 spdk_app_start is called in Round 3. 00:06:48.421 Shutdown signal received, stop current app iteration 00:06:48.421 11:43:15 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:48.421 11:43:15 event.app_repeat -- event/event.sh@42 -- # return 0 00:06:48.421 00:06:48.421 real 0m18.372s 00:06:48.421 user 0m39.592s 00:06:48.421 sys 0m3.762s 00:06:48.421 11:43:15 event.app_repeat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:48.421 11:43:15 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:48.421 ************************************ 00:06:48.421 END TEST app_repeat 00:06:48.421 ************************************ 00:06:48.421 11:43:15 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:48.421 00:06:48.421 real 0m28.058s 00:06:48.421 user 0m56.220s 00:06:48.421 sys 0m5.108s 00:06:48.421 11:43:15 event -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:48.421 11:43:15 event -- common/autotest_common.sh@10 -- # set +x 00:06:48.421 ************************************ 00:06:48.421 END TEST event 00:06:48.421 ************************************ 00:06:48.421 11:43:15 -- spdk/autotest.sh@178 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:48.421 11:43:15 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:48.421 11:43:15 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:48.421 11:43:15 -- common/autotest_common.sh@10 -- # set +x 00:06:48.680 ************************************ 00:06:48.680 START TEST thread 00:06:48.680 ************************************ 00:06:48.680 11:43:15 thread -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:48.680 * Looking for test storage... 
00:06:48.680 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:06:48.680 11:43:15 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:48.680 11:43:15 thread -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:06:48.680 11:43:15 thread -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:48.680 11:43:15 thread -- common/autotest_common.sh@10 -- # set +x 00:06:48.680 ************************************ 00:06:48.680 START TEST thread_poller_perf 00:06:48.680 ************************************ 00:06:48.680 11:43:15 thread.thread_poller_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:48.680 [2024-05-14 11:43:15.712948] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:06:48.680 [2024-05-14 11:43:15.713012] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1635946 ] 00:06:48.939 [2024-05-14 11:43:15.843621] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.939 [2024-05-14 11:43:15.945618] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.939 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:50.313 ====================================== 00:06:50.313 busy:2309040754 (cyc) 00:06:50.313 total_run_count: 267000 00:06:50.313 tsc_hz: 2300000000 (cyc) 00:06:50.313 ====================================== 00:06:50.313 poller_cost: 8648 (cyc), 3760 (nsec) 00:06:50.313 00:06:50.313 real 0m1.360s 00:06:50.313 user 0m1.201s 00:06:50.313 sys 0m0.151s 00:06:50.313 11:43:17 thread.thread_poller_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:50.313 11:43:17 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:50.313 ************************************ 00:06:50.313 END TEST thread_poller_perf 00:06:50.313 ************************************ 00:06:50.313 11:43:17 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:50.313 11:43:17 thread -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:06:50.313 11:43:17 thread -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:50.313 11:43:17 thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.313 ************************************ 00:06:50.313 START TEST thread_poller_perf 00:06:50.313 ************************************ 00:06:50.313 11:43:17 thread.thread_poller_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:50.313 [2024-05-14 11:43:17.157874] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:06:50.313 [2024-05-14 11:43:17.157939] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1636192 ] 00:06:50.313 [2024-05-14 11:43:17.276191] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.313 [2024-05-14 11:43:17.376882] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.313 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:51.689 ====================================== 00:06:51.689 busy:2302787132 (cyc) 00:06:51.689 total_run_count: 3517000 00:06:51.689 tsc_hz: 2300000000 (cyc) 00:06:51.689 ====================================== 00:06:51.689 poller_cost: 654 (cyc), 284 (nsec) 00:06:51.689 00:06:51.689 real 0m1.338s 00:06:51.689 user 0m1.210s 00:06:51.689 sys 0m0.122s 00:06:51.689 11:43:18 thread.thread_poller_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:51.689 11:43:18 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:51.689 ************************************ 00:06:51.689 END TEST thread_poller_perf 00:06:51.689 ************************************ 00:06:51.689 11:43:18 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:06:51.689 00:06:51.689 real 0m2.977s 00:06:51.689 user 0m2.510s 00:06:51.689 sys 0m0.468s 00:06:51.689 11:43:18 thread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:51.689 11:43:18 thread -- common/autotest_common.sh@10 -- # set +x 00:06:51.689 ************************************ 00:06:51.689 END TEST thread 00:06:51.689 ************************************ 00:06:51.689 11:43:18 -- spdk/autotest.sh@179 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:51.689 11:43:18 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:51.689 11:43:18 -- common/autotest_common.sh@1103 -- # xtrace_disable 
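The poller_cost figures in the two runs above follow directly from the raw counters reported by poller_perf: cycles per poller invocation is busy cycles divided by total_run_count, and the nanosecond cost converts those cycles through tsc_hz. A quick integer-arithmetic check of both runs:

```shell
# Recompute the poller_cost lines above from the logged counters
# (busy cycles, total_run_count, tsc_hz):
#   cost_cyc  = busy / run_count
#   cost_nsec = cost_cyc * 10^9 / tsc_hz
poller_cost() {
  local busy=$1 runs=$2 hz=$3
  local cyc=$(( busy / runs ))
  echo "$cyc $(( cyc * 1000000000 / hz ))"
}
poller_cost 2309040754 267000 2300000000   # 8648 3760 (the 1us-period run)
poller_cost 2302787132 3517000 2300000000  # 654 284 (the 0us-period run)
```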
00:06:51.689 11:43:18 -- common/autotest_common.sh@10 -- # set +x 00:06:51.689 ************************************ 00:06:51.689 START TEST accel 00:06:51.689 ************************************ 00:06:51.689 11:43:18 accel -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:51.689 * Looking for test storage... 00:06:51.689 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:06:51.689 11:43:18 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:06:51.689 11:43:18 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:06:51.689 11:43:18 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:51.689 11:43:18 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1636498 00:06:51.689 11:43:18 accel -- accel/accel.sh@63 -- # waitforlisten 1636498 00:06:51.689 11:43:18 accel -- common/autotest_common.sh@827 -- # '[' -z 1636498 ']' 00:06:51.689 11:43:18 accel -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:51.689 11:43:18 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:51.689 11:43:18 accel -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:51.689 11:43:18 accel -- accel/accel.sh@61 -- # build_accel_config 00:06:51.689 11:43:18 accel -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:51.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:51.689 11:43:18 accel -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:51.689 11:43:18 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:51.689 11:43:18 accel -- common/autotest_common.sh@10 -- # set +x 00:06:51.689 11:43:18 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:51.689 11:43:18 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.689 11:43:18 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.689 11:43:18 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:51.689 11:43:18 accel -- accel/accel.sh@40 -- # local IFS=, 00:06:51.689 11:43:18 accel -- accel/accel.sh@41 -- # jq -r . 00:06:51.689 [2024-05-14 11:43:18.751196] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:06:51.689 [2024-05-14 11:43:18.751256] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1636498 ] 00:06:51.949 [2024-05-14 11:43:18.861977] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.949 [2024-05-14 11:43:18.961179] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@860 -- # return 0 00:06:52.884 11:43:19 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:06:52.884 11:43:19 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:06:52.884 11:43:19 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:06:52.884 11:43:19 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:06:52.884 11:43:19 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:52.884 11:43:19 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@10 -- # set +x 00:06:52.884 11:43:19 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # 
IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.884 11:43:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.884 11:43:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.884 11:43:19 accel -- accel/accel.sh@75 -- # killprocess 1636498 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@946 -- # '[' -z 1636498 ']' 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@950 -- # kill -0 1636498 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@951 -- # uname 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1636498 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1636498' 00:06:52.884 killing process with pid 1636498 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@965 -- # kill 1636498 00:06:52.884 11:43:19 accel -- common/autotest_common.sh@970 -- # wait 1636498 00:06:53.143 11:43:20 accel -- accel/accel.sh@76 -- # trap - ERR 00:06:53.143 11:43:20 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:06:53.143 11:43:20 accel -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 
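The `expected_opcs` loop above consumes the output of the `jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'` filter, which flattens the accel_get_opc_assignments JSON object into one `opc=module` line per opcode; each line is then split on `=` via `IFS==`. A stand-alone sketch of that consumption step (the input lines here are illustrative, not real RPC output):

```shell
# Split pre-flattened "opc=module" lines into an associative array,
# mirroring the IFS== / read -r opc module pattern used by accel.sh.
declare -A expected_opcs
while IFS== read -r opc module; do
  expected_opcs["$opc"]=$module
done <<'EOF'
copy=software
crc32c=software
EOF
echo "${expected_opcs[crc32c]}"   # software
```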
00:06:53.143 11:43:20 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:53.143 11:43:20 accel -- common/autotest_common.sh@10 -- # set +x 00:06:53.143 11:43:20 accel.accel_help -- common/autotest_common.sh@1121 -- # accel_perf -h 00:06:53.143 11:43:20 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:53.143 11:43:20 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:06:53.143 11:43:20 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:53.143 11:43:20 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:53.143 11:43:20 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.143 11:43:20 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.143 11:43:20 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:53.143 11:43:20 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:06:53.143 11:43:20 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:06:53.401 11:43:20 accel.accel_help -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:53.401 11:43:20 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:06:53.401 11:43:20 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:53.401 11:43:20 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:06:53.401 11:43:20 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:53.401 11:43:20 accel -- common/autotest_common.sh@10 -- # set +x 00:06:53.401 ************************************ 00:06:53.401 START TEST accel_missing_filename 00:06:53.401 ************************************ 00:06:53.401 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w compress 00:06:53.401 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:06:53.401 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:53.401 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:53.401 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:53.401 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:53.401 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:53.401 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:06:53.401 11:43:20 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:53.401 11:43:20 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:06:53.401 11:43:20 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:53.401 11:43:20 
accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:53.401 11:43:20 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.401 11:43:20 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.401 11:43:20 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:53.401 11:43:20 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:06:53.401 11:43:20 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:06:53.401 [2024-05-14 11:43:20.379485] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:06:53.401 [2024-05-14 11:43:20.379544] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1636722 ] 00:06:53.661 [2024-05-14 11:43:20.505884] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.661 [2024-05-14 11:43:20.606798] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.661 [2024-05-14 11:43:20.679185] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:53.921 [2024-05-14 11:43:20.752795] accel_perf.c:1393:main: *ERROR*: ERROR starting application 00:06:53.921 A filename is required. 
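When accel_perf exits non-zero as expected, the NOT wrapper in autotest_common.sh folds the raw exit status, which is what produces the `es=234` then `es=106` then `es=1` trace that follows (and `es=161` then `es=33` then `es=1` in the next test). A hedged stand-alone sketch of that folding, inferred from the trace rather than copied from the real helper:

```shell
# Sketch (not the real autotest_common.sh): statuses above 128 are reduced
# by 128 (signal-style exits), then any remaining failure collapses to 1,
# so NOT only needs to check for a non-zero result.
fold_es() {
  local es=$1
  (( es > 128 )) && es=$(( es - 128 ))  # 234 -> 106, 161 -> 33
  case "$es" in
    0) ;;         # genuine success passes through unchanged
    *) es=1 ;;    # any failure code collapses to a single failure
  esac
  echo "$es"
}
fold_es 234   # 1
fold_es 161   # 1
```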
00:06:53.921 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:06:53.921 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:53.921 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:06:53.921 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:06:53.921 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:06:53.921 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:53.921 00:06:53.921 real 0m0.506s 00:06:53.921 user 0m0.345s 00:06:53.921 sys 0m0.190s 00:06:53.921 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:53.921 11:43:20 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:06:53.921 ************************************ 00:06:53.921 END TEST accel_missing_filename 00:06:53.921 ************************************ 00:06:53.921 11:43:20 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:53.921 11:43:20 accel -- common/autotest_common.sh@1097 -- # '[' 10 -le 1 ']' 00:06:53.921 11:43:20 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:53.921 11:43:20 accel -- common/autotest_common.sh@10 -- # set +x 00:06:53.921 ************************************ 00:06:53.921 START TEST accel_compress_verify 00:06:53.921 ************************************ 00:06:53.921 11:43:20 accel.accel_compress_verify -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:53.921 11:43:20 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:06:53.921 11:43:20 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg 
accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:53.921 11:43:20 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:53.921 11:43:20 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:53.921 11:43:20 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:53.921 11:43:20 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:53.921 11:43:20 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:53.921 11:43:20 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:53.921 11:43:20 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:53.921 11:43:20 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:53.921 11:43:20 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:53.921 11:43:20 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.921 11:43:20 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.921 11:43:20 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:53.921 11:43:20 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:53.921 11:43:20 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:06:53.921 [2024-05-14 11:43:20.958481] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:06:53.921 [2024-05-14 11:43:20.958538] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1636750 ] 00:06:54.181 [2024-05-14 11:43:21.088543] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.181 [2024-05-14 11:43:21.189995] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.181 [2024-05-14 11:43:21.265669] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:54.439 [2024-05-14 11:43:21.339587] accel_perf.c:1393:main: *ERROR*: ERROR starting application 00:06:54.439 00:06:54.439 Compression does not support the verify option, aborting. 00:06:54.439 11:43:21 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:06:54.439 11:43:21 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:54.439 11:43:21 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:06:54.439 11:43:21 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:06:54.439 11:43:21 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:06:54.439 11:43:21 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:54.439 00:06:54.439 real 0m0.511s 00:06:54.439 user 0m0.331s 00:06:54.439 sys 0m0.204s 00:06:54.439 11:43:21 accel.accel_compress_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:54.439 11:43:21 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:06:54.439 ************************************ 00:06:54.439 END TEST accel_compress_verify 00:06:54.439 ************************************ 00:06:54.439 11:43:21 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:54.439 11:43:21 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 
']' 00:06:54.439 11:43:21 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:54.439 11:43:21 accel -- common/autotest_common.sh@10 -- # set +x 00:06:54.439 ************************************ 00:06:54.439 START TEST accel_wrong_workload 00:06:54.439 ************************************ 00:06:54.439 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w foobar 00:06:54.439 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:06:54.439 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:54.439 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:54.439 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:54.439 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:54.439 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:54.439 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:06:54.439 11:43:21 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:54.699 11:43:21 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:06:54.699 11:43:21 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:54.699 11:43:21 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:54.699 11:43:21 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.699 11:43:21 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.699 11:43:21 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:54.699 11:43:21 accel.accel_wrong_workload -- accel/accel.sh@40 -- # 
local IFS=, 00:06:54.699 11:43:21 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:06:54.699 Unsupported workload type: foobar 00:06:54.699 [2024-05-14 11:43:21.552200] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:54.699 accel_perf options: 00:06:54.699 [-h help message] 00:06:54.699 [-q queue depth per core] 00:06:54.699 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:54.699 [-T number of threads per core 00:06:54.699 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:54.699 [-t time in seconds] 00:06:54.699 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:54.699 [ dif_verify, , dif_generate, dif_generate_copy 00:06:54.699 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:54.699 [-l for compress/decompress workloads, name of uncompressed input file 00:06:54.699 [-S for crc32c workload, use this seed value (default 0) 00:06:54.699 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:54.699 [-f for fill workload, use this BYTE value (default 255) 00:06:54.699 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:54.699 [-y verify result if this switch is on] 00:06:54.699 [-a tasks to allocate per core (default: same value as -q)] 00:06:54.699 Can be used to spread operations across a wider range of memory. 
00:06:54.699 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:06:54.699 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:54.699 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:54.699 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:54.699 00:06:54.699 real 0m0.039s 00:06:54.699 user 0m0.021s 00:06:54.699 sys 0m0.018s 00:06:54.699 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:54.699 11:43:21 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:06:54.699 ************************************ 00:06:54.699 END TEST accel_wrong_workload 00:06:54.699 ************************************ 00:06:54.699 Error: writing output failed: Broken pipe 00:06:54.699 11:43:21 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:54.699 11:43:21 accel -- common/autotest_common.sh@1097 -- # '[' 10 -le 1 ']' 00:06:54.699 11:43:21 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:54.699 11:43:21 accel -- common/autotest_common.sh@10 -- # set +x 00:06:54.699 ************************************ 00:06:54.699 START TEST accel_negative_buffers 00:06:54.699 ************************************ 00:06:54.699 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:54.699 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:06:54.699 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:54.699 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:06:54.699 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:54.699 11:43:21 
accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:06:54.699 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:54.699 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:06:54.699 11:43:21 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:54.699 11:43:21 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:06:54.699 11:43:21 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:54.699 11:43:21 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:54.699 11:43:21 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.699 11:43:21 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.699 11:43:21 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:54.699 11:43:21 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:06:54.699 11:43:21 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:06:54.699 -x option must be non-negative. 00:06:54.699 [2024-05-14 11:43:21.676269] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:54.699 accel_perf options: 00:06:54.699 [-h help message] 00:06:54.699 [-q queue depth per core] 00:06:54.699 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:54.699 [-T number of threads per core 00:06:54.699 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:06:54.699 [-t time in seconds] 00:06:54.699 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:54.699 [ dif_verify, , dif_generate, dif_generate_copy 00:06:54.699 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:54.699 [-l for compress/decompress workloads, name of uncompressed input file 00:06:54.699 [-S for crc32c workload, use this seed value (default 0) 00:06:54.699 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:54.699 [-f for fill workload, use this BYTE value (default 255) 00:06:54.699 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:54.699 [-y verify result if this switch is on] 00:06:54.699 [-a tasks to allocate per core (default: same value as -q)] 00:06:54.700 Can be used to spread operations across a wider range of memory. 00:06:54.700 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:06:54.700 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:54.700 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:54.700 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:54.700 00:06:54.700 real 0m0.042s 00:06:54.700 user 0m0.026s 00:06:54.700 sys 0m0.015s 00:06:54.700 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:54.700 11:43:21 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:06:54.700 ************************************ 00:06:54.700 END TEST accel_negative_buffers 00:06:54.700 ************************************ 00:06:54.700 Error: writing output failed: Broken pipe 00:06:54.700 11:43:21 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c 
-S 32 -y 00:06:54.700 11:43:21 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:06:54.700 11:43:21 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:54.700 11:43:21 accel -- common/autotest_common.sh@10 -- # set +x 00:06:54.700 ************************************ 00:06:54.700 START TEST accel_crc32c 00:06:54.700 ************************************ 00:06:54.700 11:43:21 accel.accel_crc32c -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:54.700 11:43:21 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:54.959 [2024-05-14 11:43:21.810305] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:06:54.959 [2024-05-14 11:43:21.810374] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1636976 ] 00:06:54.959 [2024-05-14 11:43:21.940509] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.218 [2024-05-14 11:43:22.045544] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c 
-- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:55.218 11:43:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 
00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:56.778 11:43:23 accel.accel_crc32c -- 
accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:56.778 11:43:23 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.778 00:06:56.778 real 0m1.517s 00:06:56.778 user 0m1.322s 00:06:56.778 sys 0m0.199s 00:06:56.778 11:43:23 accel.accel_crc32c -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:56.778 11:43:23 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:56.778 ************************************ 00:06:56.778 END TEST accel_crc32c 00:06:56.778 ************************************ 00:06:56.778 11:43:23 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:56.778 11:43:23 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:06:56.778 11:43:23 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:56.778 11:43:23 accel -- common/autotest_common.sh@10 -- # set +x 00:06:56.778 ************************************ 00:06:56.778 START TEST accel_crc32c_C2 00:06:56.778 ************************************ 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:56.778 11:43:23 accel.accel_crc32c_C2 
-- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:56.778 [2024-05-14 11:43:23.410384] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:06:56.778 [2024-05-14 11:43:23.410448] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1637175 ] 00:06:56.778 [2024-05-14 11:43:23.539883] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.778 [2024-05-14 11:43:23.641325] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.778 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- 
accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:56.779 11:43:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.156 
11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.156 00:06:58.156 real 0m1.508s 00:06:58.156 user 0m1.318s 00:06:58.156 sys 0m0.197s 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:58.156 11:43:24 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:58.156 ************************************ 00:06:58.156 END TEST accel_crc32c_C2 00:06:58.156 ************************************ 00:06:58.156 11:43:24 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:58.156 11:43:24 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:06:58.156 11:43:24 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:58.156 11:43:24 accel -- common/autotest_common.sh@10 -- # set +x 00:06:58.156 ************************************ 00:06:58.156 START TEST accel_copy 00:06:58.156 ************************************ 00:06:58.156 11:43:24 accel.accel_copy -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy -y 
00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:06:58.156 11:43:24 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:06:58.156 [2024-05-14 11:43:24.996492] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:06:58.156 [2024-05-14 11:43:24.996558] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1637369 ] 00:06:58.156 [2024-05-14 11:43:25.125486] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.156 [2024-05-14 11:43:25.227510] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 
00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r 
var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.415 11:43:25 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:59.792 
11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:06:59.792 11:43:26 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.792 00:06:59.792 real 0m1.512s 00:06:59.792 user 0m1.324s 00:06:59.792 sys 0m0.190s 00:06:59.792 11:43:26 accel.accel_copy -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:59.792 11:43:26 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:06:59.792 ************************************ 00:06:59.792 END TEST accel_copy 00:06:59.792 
************************************ 00:06:59.792 11:43:26 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:59.792 11:43:26 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:06:59.792 11:43:26 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:59.792 11:43:26 accel -- common/autotest_common.sh@10 -- # set +x 00:06:59.792 ************************************ 00:06:59.792 START TEST accel_fill 00:06:59.792 ************************************ 00:06:59.792 11:43:26 accel.accel_fill -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:06:59.792 11:43:26 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 
00:06:59.792 [2024-05-14 11:43:26.590282] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:06:59.792 [2024-05-14 11:43:26.590339] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1637640 ] 00:06:59.792 [2024-05-14 11:43:26.717713] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.792 [2024-05-14 11:43:26.815230] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 
accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 
11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:00.099 11:43:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:01.054 11:43:28 
accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:01.054 11:43:28 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:01.054 11:43:28 accel.accel_fill -- 
accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.054 00:07:01.054 real 0m1.497s 00:07:01.054 user 0m1.326s 00:07:01.054 sys 0m0.172s 00:07:01.054 11:43:28 accel.accel_fill -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:01.054 11:43:28 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:01.054 ************************************ 00:07:01.054 END TEST accel_fill 00:07:01.054 ************************************ 00:07:01.054 11:43:28 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:01.054 11:43:28 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:01.054 11:43:28 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:01.054 11:43:28 accel -- common/autotest_common.sh@10 -- # set +x 00:07:01.312 ************************************ 00:07:01.312 START TEST accel_copy_crc32c 00:07:01.312 ************************************ 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy_crc32c -y 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:01.312 11:43:28 
accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:01.312 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:01.312 [2024-05-14 11:43:28.184465] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:01.312 [2024-05-14 11:43:28.184528] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1637932 ] 00:07:01.312 [2024-05-14 11:43:28.299634] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.571 [2024-05-14 11:43:28.406494] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 
00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:01.571 11:43:28 
accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:01.571 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.572 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.572 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:01.572 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:01.572 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:01.572 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:01.572 11:43:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:02.947 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:02.948 11:43:29 
accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.948 00:07:02.948 real 0m1.500s 00:07:02.948 user 0m1.326s 00:07:02.948 sys 0m0.179s 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:02.948 11:43:29 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:02.948 ************************************ 00:07:02.948 END TEST accel_copy_crc32c 00:07:02.948 ************************************ 00:07:02.948 11:43:29 accel -- accel/accel.sh@106 -- # run_test 
accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:02.948 11:43:29 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:02.948 11:43:29 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:02.948 11:43:29 accel -- common/autotest_common.sh@10 -- # set +x 00:07:02.948 ************************************ 00:07:02.948 START TEST accel_copy_crc32c_C2 00:07:02.948 ************************************ 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:02.948 11:43:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 
00:07:02.948 [2024-05-14 11:43:29.769042] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:02.948 [2024-05-14 11:43:29.769103] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1638125 ] 00:07:02.948 [2024-05-14 11:43:29.897468] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.948 [2024-05-14 11:43:29.998124] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 
00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var 
val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:03.207 11:43:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:04.583 11:43:31 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.583 00:07:04.583 real 0m1.513s 00:07:04.583 user 0m1.321s 00:07:04.583 sys 0m0.190s 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:04.583 11:43:31 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:04.583 ************************************ 00:07:04.583 END TEST accel_copy_crc32c_C2 00:07:04.583 ************************************ 00:07:04.583 11:43:31 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:04.583 11:43:31 accel -- common/autotest_common.sh@1097 
-- # '[' 7 -le 1 ']' 00:07:04.583 11:43:31 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:04.583 11:43:31 accel -- common/autotest_common.sh@10 -- # set +x 00:07:04.583 ************************************ 00:07:04.583 START TEST accel_dualcast 00:07:04.583 ************************************ 00:07:04.583 11:43:31 accel.accel_dualcast -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dualcast -y 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:04.583 11:43:31 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:04.583 [2024-05-14 11:43:31.368316] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:04.583 [2024-05-14 11:43:31.368376] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1638325 ] 00:07:04.583 [2024-05-14 11:43:31.497784] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.583 [2024-05-14 11:43:31.598388] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 
00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:04.842 11:43:31 
accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.842 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:04.843 11:43:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:05.779 
11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:05.779 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:05.780 11:43:32 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == 
\s\o\f\t\w\a\r\e ]] 00:07:05.780 00:07:05.780 real 0m1.511s 00:07:05.780 user 0m1.312s 00:07:05.780 sys 0m0.196s 00:07:05.780 11:43:32 accel.accel_dualcast -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:05.780 11:43:32 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:05.780 ************************************ 00:07:05.780 END TEST accel_dualcast 00:07:05.780 ************************************ 00:07:06.038 11:43:32 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:06.038 11:43:32 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:06.038 11:43:32 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:06.038 11:43:32 accel -- common/autotest_common.sh@10 -- # set +x 00:07:06.038 ************************************ 00:07:06.038 START TEST accel_compare 00:07:06.038 ************************************ 00:07:06.038 11:43:32 accel.accel_compare -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compare -y 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.038 11:43:32 
accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:06.038 11:43:32 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:06.038 [2024-05-14 11:43:32.969376] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:06.038 [2024-05-14 11:43:32.969452] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1638522 ] 00:07:06.038 [2024-05-14 11:43:33.097177] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.296 [2024-05-14 11:43:33.196289] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" 
in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- 
# val=32 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 
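The compare test configured above uses the same parameters (4096-byte blocks, queue depth 32, 1 second, software module). As an illustrative aside, not taken from the log or from SPDK sources, the "compare" operation amounts to a memcmp-style equality check between two blocks:

```python
# Illustrative sketch (not SPDK code): the "compare" accel operation --
# two equal-length blocks are compared byte-for-byte, memcmp-style.

BLOCK_SIZE = 4096  # matches the '4096 bytes' value in the test config above

def accel_compare(buf_a: bytes, buf_b: bytes) -> bool:
    """Return True when the two blocks are byte-for-byte identical."""
    if len(buf_a) != len(buf_b):
        return False
    return buf_a == buf_b

if __name__ == "__main__":
    a = b"\xaa" * BLOCK_SIZE
    assert accel_compare(a, bytes(a))
    assert not accel_compare(a, b"\xab" * BLOCK_SIZE)
    print("compare ok")
```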
00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:06.296 11:43:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:07.668 11:43:34 accel.accel_compare 
-- accel/accel.sh@19 -- # read -r var val 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:07.668 11:43:34 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.668 00:07:07.668 real 0m1.491s 00:07:07.668 user 0m1.320s 00:07:07.668 sys 0m0.175s 00:07:07.668 11:43:34 accel.accel_compare -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:07.668 11:43:34 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:07.668 ************************************ 00:07:07.668 END TEST accel_compare 00:07:07.668 ************************************ 00:07:07.668 11:43:34 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:07.668 11:43:34 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:07.668 11:43:34 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:07.668 11:43:34 accel -- common/autotest_common.sh@10 -- # set +x 00:07:07.668 ************************************ 00:07:07.668 START TEST accel_xor 00:07:07.668 ************************************ 00:07:07.668 11:43:34 accel.accel_xor -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w xor -y 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:07.668 11:43:34 accel.accel_xor -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:07.668 11:43:34 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:07.668 [2024-05-14 11:43:34.544417] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:07.668 [2024-05-14 11:43:34.544477] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1638737 ] 00:07:07.668 [2024-05-14 11:43:34.674960] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.926 [2024-05-14 11:43:34.777315] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:07.926 11:43:34 
accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.926 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.927 11:43:34 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:07.927 11:43:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.302 11:43:36 
accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:09.302 00:07:09.302 real 0m1.511s 00:07:09.302 user 0m1.319s 00:07:09.302 sys 0m0.191s 00:07:09.302 11:43:36 accel.accel_xor -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:09.302 11:43:36 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:09.302 ************************************ 00:07:09.302 END TEST accel_xor 00:07:09.302 ************************************ 00:07:09.302 11:43:36 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:09.302 11:43:36 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:09.302 11:43:36 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:09.302 11:43:36 accel -- common/autotest_common.sh@10 -- # set +x 00:07:09.302 ************************************ 00:07:09.302 START TEST accel_xor 00:07:09.302 ************************************ 00:07:09.302 11:43:36 accel.accel_xor -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w xor -y -x 3 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c 
/dev/fd/62 -t 1 -w xor -y -x 3 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:09.302 11:43:36 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:09.302 [2024-05-14 11:43:36.139253] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:09.302 [2024-05-14 11:43:36.139310] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1639043 ] 00:07:09.302 [2024-05-14 11:43:36.267431] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.302 [2024-05-14 11:43:36.368684] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case 
"$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor 
-- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- 
accel/accel.sh@20 -- # val= 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:09.561 11:43:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:10.935 11:43:37 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:10.935 11:43:37 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:10.935 00:07:10.935 real 0m1.506s 00:07:10.935 user 0m1.320s 00:07:10.935 sys 0m0.190s 00:07:10.935 11:43:37 accel.accel_xor -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:10.935 11:43:37 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:10.935 ************************************ 00:07:10.935 END TEST accel_xor 00:07:10.935 ************************************ 00:07:10.936 11:43:37 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:10.936 11:43:37 accel -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:07:10.936 11:43:37 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:10.936 11:43:37 accel -- common/autotest_common.sh@10 -- # set +x 00:07:10.936 ************************************ 00:07:10.936 START TEST accel_dif_verify 00:07:10.936 ************************************ 00:07:10.936 11:43:37 accel.accel_dif_verify -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_verify 00:07:10.936 11:43:37 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:10.936 11:43:37 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:10.936 11:43:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:10.936 11:43:37 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:10.936 11:43:37 
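The trace above is bash xtrace from accel.sh's config read-back loop (`IFS=:`, `read -r var val`, `case "$var" in`). A minimal self-contained sketch of that pattern follows; the field names `opcode`/`module` and the inline sample data are illustrative stand-ins, not real accel_perf output:

```shell
# Sketch of the key:value parsing loop whose xtrace appears above: each
# "name: value" line is split on ':' and selected fields are captured.
accel_opc=
accel_module=
while IFS=: read -r var val; do
    # splitting on ':' leaves a leading space on val, so trim it
    val=${val# }
    case "$var" in
        opcode) accel_opc=$val ;;
        module) accel_module=$val ;;
    esac
done <<EOF
opcode: xor
module: software
EOF
echo "ran $accel_opc on $accel_module"
```

The trailing echo mirrors the `[[ -n software ]]` / `[[ -n xor ]]` checks the harness runs after the loop: both variables must have been populated for the test to pass.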
00:07:10.936 11:43:37 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify
00:07:10.936 ************************************
00:07:10.936 START TEST accel_dif_verify
00:07:10.936 ************************************
00:07:10.936 11:43:37 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify
00:07:10.936 11:43:37 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify
00:07:10.936 11:43:37 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config
00:07:10.936 [2024-05-14 11:43:37.731883] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:07:10.936 [2024-05-14 11:43:37.731941] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1639276 ]
00:07:10.936 [2024-05-14 11:43:37.861162] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:10.936 [2024-05-14 11:43:37.961553] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:11.195 11:43:38 accel.accel_dif_verify -- accel/accel.sh -- # config read back from accel_perf: accel_opc=dif_verify, accel_module=software; remaining vals: 0x1, '4096 bytes', '4096 bytes', '512 bytes', '8 bytes', 32, 32, 1, '1 seconds', No
00:07:12.130 11:43:39 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:12.130 11:43:39 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]]
00:07:12.130 11:43:39 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:12.130
00:07:12.130 real 0m1.508s
00:07:12.130 user 0m1.321s
00:07:12.130 sys 0m0.194s
00:07:12.130 ************************************
00:07:12.130 END TEST accel_dif_verify
00:07:12.130 ************************************
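Each test in this log is launched through run_test, which prints the START/END banners and timing seen above. A hypothetical, heavily reduced version of that wrapper is sketched below; the real helper in common/autotest_common.sh also manages xtrace and result bookkeeping, which this sketch omits:

```shell
# Reduced run_test sketch: banner, run the command, banner, preserve its
# exit status. The banner format matches the log above.
run_test() {
    name=$1
    shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    "$@"
    rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return $rc
}

run_test accel_demo true
```

Returning the wrapped command's own exit status lets the surrounding catchError stage fail the build when any single test fails.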
00:07:12.388 11:43:39 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate
00:07:12.388 ************************************
00:07:12.388 START TEST accel_dif_generate
00:07:12.388 ************************************
00:07:12.388 11:43:39 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate
00:07:12.388 11:43:39 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate
00:07:12.388 11:43:39 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config
00:07:12.388 [2024-05-14 11:43:39.337427] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:07:12.388 [2024-05-14 11:43:39.337498] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1639468 ]
00:07:12.388 [2024-05-14 11:43:39.469216] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:12.646 [2024-05-14 11:43:39.572395] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:12.647 11:43:39 accel.accel_dif_generate -- accel/accel.sh -- # config read back from accel_perf: accel_opc=dif_generate, accel_module=software; remaining vals: 0x1, '4096 bytes', '4096 bytes', '512 bytes', '8 bytes', 32, 32, 1, '1 seconds', No
00:07:14.023 11:43:40 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:14.023 11:43:40 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]]
00:07:14.023 11:43:40 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:14.023
00:07:14.023 real 0m1.511s
00:07:14.023 user 0m1.331s
00:07:14.023 sys 0m0.182s
00:07:14.023 ************************************
00:07:14.023 END TEST accel_dif_generate
00:07:14.023 ************************************
00:07:14.023 11:43:40 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy
00:07:14.023 ************************************
00:07:14.023 START TEST accel_dif_generate_copy
00:07:14.023 ************************************
00:07:14.023 11:43:40 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy
00:07:14.023 11:43:40 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy
00:07:14.023 11:43:40 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config
00:07:14.023 [2024-05-14 11:43:40.936200] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:07:14.023 [2024-05-14 11:43:40.936255] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1639674 ]
00:07:14.023 [2024-05-14 11:43:41.064288] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:14.281 [2024-05-14 11:43:41.166121] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:14.281 11:43:41 accel.accel_dif_generate_copy -- accel/accel.sh -- # config read back from accel_perf: accel_opc=dif_generate_copy, accel_module=software; remaining vals: 0x1, '4096 bytes', '4096 bytes', 32, 32, 1, '1 seconds', No
00:07:15.654 11:43:42 accel.accel_dif_generate_copy -- accel/accel.sh@19
00:07:15.655 11:43:42 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:15.655 11:43:42 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]]
00:07:15.655 11:43:42 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:15.655
00:07:15.655 real 0m1.511s
00:07:15.655 user 0m1.327s
00:07:15.655 sys 0m0.187s
00:07:15.655 ************************************
00:07:15.655 END TEST accel_dif_generate_copy
00:07:15.655 ************************************
00:07:15.655 11:43:42 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:07:15.655 ************************************
00:07:15.655 START TEST accel_comp
00:07:15.655 ************************************
00:07:15.655 11:43:42 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
[common/autotest_common.sh and build_accel_config xtrace elided]
00:07:15.655 [2024-05-14 11:43:42.532600] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:07:15.655 [2024-05-14 11:43:42.532660] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1639872 ]
00:07:15.655 [2024-05-14 11:43:42.662647] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:15.914 [2024-05-14 11:43:42.765652] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[accel.sh xtrace elided: accel_comp configured with accel_opc=compress, 4096-byte blocks, accel_module=software, input file /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib, queue depth 32, 1 thread, 1 second run, verify=No]
00:07:17.328 11:43:44 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:17.328 11:43:44 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]]
00:07:17.328 11:43:44 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:17.328
00:07:17.328 real 0m1.516s
00:07:17.328 user 0m1.333s
00:07:17.328 sys 0m0.188s
00:07:17.328 ************************************
00:07:17.328 END TEST accel_comp
00:07:17.328 ************************************
00:07:17.328 11:43:44 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:07:17.328 ************************************
00:07:17.328 START TEST accel_decomp
00:07:17.328 ************************************
00:07:17.328 11:43:44 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:07:17.328 [2024-05-14 11:43:44.137097] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:07:17.328 [2024-05-14 11:43:44.137155] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1640158 ]
00:07:17.328 [2024-05-14 11:43:44.265661] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:17.328 [2024-05-14 11:43:44.366096] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[accel.sh xtrace elided: accel_decomp configured with accel_opc=decompress, 4096-byte blocks, accel_module=software, input file /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib, queue depth 32, 1 thread, 1 second run, verify=Yes]
00:07:18.782 11:43:45 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:18.782 11:43:45 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:07:18.782 11:43:45 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:18.782
00:07:18.782 real 0m1.512s
00:07:18.782 user 0m1.325s
00:07:18.782 sys 0m0.189s
00:07:18.782 ************************************
00:07:18.782 END TEST accel_decomp
00:07:18.782 ************************************
00:07:18.782 11:43:45 accel -- accel/accel.sh@118 -- # run_test accel_decmop_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:18.782 ************************************
00:07:18.782 START TEST accel_decmop_full
00:07:18.782 ************************************
00:07:18.782 11:43:45 accel.accel_decmop_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:18.783 [2024-05-14 11:43:45.734715] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:07:18.783 [2024-05-14 11:43:45.734774] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1640425 ]
[2024-05-14 11:43:45.861119] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:19.041 [2024-05-14 11:43:45.963649] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[accel.sh xtrace elided: accel_decmop_full configured with accel_opc=decompress, 111250-byte blocks, accel_module=software, input file /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib, queue depth 32, 1 thread, 1 second run, verify=Yes; log truncated mid-configuration]
00:07:19.042 11:43:46 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:19.042 11:43:46 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:19.042 11:43:46 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:19.042 11:43:46 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:19.042 11:43:46 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:19.042 11:43:46 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:19.042 11:43:46 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:19.042 11:43:46 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:19.042 11:43:46 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:20.418 
11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:20.418 11:43:47 accel.accel_decmop_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.418 00:07:20.418 real 0m1.525s 00:07:20.418 user 0m1.339s 00:07:20.418 sys 0m0.188s 00:07:20.418 11:43:47 accel.accel_decmop_full -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:20.418 11:43:47 accel.accel_decmop_full -- common/autotest_common.sh@10 -- # set +x 00:07:20.418 ************************************ 00:07:20.418 END TEST accel_decmop_full 00:07:20.418 ************************************ 00:07:20.418 11:43:47 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.418 11:43:47 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:20.418 11:43:47 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:20.418 11:43:47 accel -- common/autotest_common.sh@10 -- # set +x 00:07:20.418 ************************************ 00:07:20.418 START TEST accel_decomp_mcore 00:07:20.418 
************************************ 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:20.418 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:20.418 [2024-05-14 11:43:47.348726] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:20.418 [2024-05-14 11:43:47.348782] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1640624 ] 00:07:20.418 [2024-05-14 11:43:47.476696] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:20.676 [2024-05-14 11:43:47.578774] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:20.676 [2024-05-14 11:43:47.578858] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:20.676 [2024-05-14 11:43:47.578968] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:20.676 [2024-05-14 11:43:47.578969] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.676 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 
-- # val=software 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.677 11:43:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.049 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:22.049 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.049 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.049 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.049 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:22.049 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.049 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.049 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.049 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.050 11:43:48 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.050 
11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:22.050 00:07:22.050 real 0m1.498s 00:07:22.050 user 0m4.717s 00:07:22.050 sys 0m0.187s 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:22.050 11:43:48 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:22.050 ************************************ 00:07:22.050 END TEST accel_decomp_mcore 00:07:22.050 ************************************ 00:07:22.050 11:43:48 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.050 11:43:48 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:22.050 11:43:48 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:22.050 11:43:48 accel -- common/autotest_common.sh@10 -- # set +x 00:07:22.050 ************************************ 00:07:22.050 START TEST accel_decomp_full_mcore 00:07:22.050 ************************************ 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:22.050 11:43:48 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:22.050 [2024-05-14 11:43:48.924171] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:22.050 [2024-05-14 11:43:48.924230] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1640826 ] 00:07:22.050 [2024-05-14 11:43:49.055187] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:22.310 [2024-05-14 11:43:49.161156] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.310 [2024-05-14 11:43:49.161242] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:22.310 [2024-05-14 11:43:49.161349] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:22.310 [2024-05-14 11:43:49.161350] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:22.310 11:43:49 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:22.310 11:43:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes
[... repeated accel.sh@19-21 xtrace (case "$var" / IFS=: / read -r var val) trimmed for readability ...]
00:07:23.684 11:43:50 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] / [[ -n decompress ]] / [[ software == \s\o\f\t\w\a\r\e ]]
00:07:23.684 real 0m1.537s
00:07:23.684 user 0m4.800s
00:07:23.684 sys 0m0.220s
00:07:23.684 11:43:50 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:23.684 11:43:50 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x
************************************
END TEST accel_decomp_full_mcore
************************************
00:07:23.684 11:43:50 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.684 11:43:50 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:23.684 11:43:50 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:23.684 11:43:50 accel -- common/autotest_common.sh@10 -- # set +x
************************************
START TEST accel_decomp_mthread
00:07:23.684 ************************************ 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:23.684 11:43:50 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:23.684 [2024-05-14 11:43:50.554000] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
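The accel_perf invocation above receives its JSON accel config on /dev/fd/62 rather than from a file on disk. A minimal bash sketch of that process-substitution pattern, with accel_perf replaced by a hypothetical stub (the stub and its output format are illustrative, not SPDK behavior):

```shell
#!/usr/bin/env bash
# Minimal sketch of the "-c /dev/fd/NN" pattern in the invocation above:
# the harness builds a JSON accel config in memory and passes it through
# process substitution. accel_perf here is a stub, not the SPDK binary.
accel_perf() {
  [[ $1 == -c ]] || return 1          # expect the config-fd flag first
  echo "cfg=$(cat "$2") opts=${*:3}"  # read the config from the fd path
}

out=$(accel_perf -c <(printf '{"subsystems": []}') -t 1 -w decompress -y -T 2)
echo "$out"
```

Passing the config on a file descriptor keeps the test hermetic: nothing is written to the workspace and the config vanishes with the process.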
00:07:23.684 [2024-05-14 11:43:50.554065] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1641022 ]
00:07:23.684 [2024-05-14 11:43:50.686672] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:23.942 [2024-05-14 11:43:50.792969] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[... accel.sh@20 val settings for accel_decomp_mthread (0x1, decompress, '4096 bytes', software, /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib, 32, 32, 2, '1 seconds', Yes) and the interleaved case "$var" / IFS=: / read -r var val xtrace trimmed ...]
00:07:25.317 11:43:52 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] / [[ -n decompress ]] / [[ software == \s\o\f\t\w\a\r\e ]]
00:07:25.317 real 0m1.526s
00:07:25.317 user 0m1.324s
00:07:25.317 sys 0m0.205s
00:07:25.317 11:43:52 accel.accel_decomp_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:25.317 11:43:52 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:25.317
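The repeated xtrace in these tests comes from accel.sh replaying accel_perf's dumped settings through a small var/val parser: each line is split on `:` and dispatched through a case statement. A minimal sketch of that pattern, where the field names `module` and `opc` are hypothetical stand-ins for the script's real keys:

```shell
#!/usr/bin/env bash
# Illustrative sketch of the IFS=: read -r var val loop seen in the xtrace.
# accel_perf dumps "name:value" settings; the harness consumes them line by
# line with a case statement. Field names below are assumptions.
parse_accel_output() {
  local var val module="" opc=""
  while IFS=: read -r var val; do
    case "$var" in
      module) module=$val ;;
      opc)    opc=$val ;;
      *)      : ;;          # unmatched vars are skipped, as in accel.sh
    esac
  done
  echo "$opc/$module"
}

parse_accel_output <<'EOF'
module:software
opc:decompress
EOF
```

Each iteration of this loop produces the `IFS=:` / `read -r var val` / `case "$var" in` triple that dominates the xtrace output above.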
************************************ 00:07:25.317 END TEST accel_decomp_mthread 00:07:25.317 ************************************ 00:07:25.317 11:43:52 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.317 11:43:52 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:25.317 11:43:52 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:25.317 11:43:52 accel -- common/autotest_common.sh@10 -- # set +x 00:07:25.317 ************************************ 00:07:25.317 START TEST accel_decomp_full_mthread 00:07:25.317 ************************************ 00:07:25.317 11:43:52 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.317 11:43:52 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:25.317 11:43:52 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:25.317 11:43:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:25.317 11:43:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:25.317 11:43:52 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.317 11:43:52 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:25.317 11:43:52 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:25.317 11:43:52 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.317 11:43:52 
accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] ... accel/accel.sh@36 -- # [[ -n '' ]] ... accel/accel.sh@41 -- # jq -r .
00:07:25.317 [2024-05-14 11:43:52.167297] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:07:25.317 [2024-05-14 11:43:52.167362] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1641260 ]
00:07:25.317 [2024-05-14 11:43:52.295656] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:25.317 [2024-05-14 11:43:52.396883] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[... accel.sh@20 val settings for accel_decomp_full_mthread (0x1, decompress, '111250 bytes', software, /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib, 32, 32, 2, '1 seconds', Yes) and the interleaved case "$var" / IFS=: / read -r var val xtrace trimmed ...]
00:07:26.949 11:43:53 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] / [[ -n decompress ]] / [[ software == \s\o\f\t\w\a\r\e ]]
00:07:26.949 real 0m1.548s
00:07:26.949 user 0m1.362s
00:07:26.949 sys 0m0.188s
00:07:26.949 11:43:53 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:26.949 11:43:53 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x
************************************
END TEST accel_decomp_full_mthread
************************************
00:07:26.949 11:43:53 accel -- accel/accel.sh@124 -- # [[ y == y ]]
00:07:26.949 11:43:53 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1
00:07:26.949 11:43:53 accel -- accel/accel.sh@126 -- # get_expected_opcs
00:07:26.949 11:43:53 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:07:26.949 11:43:53 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1641565
00:07:26.949 11:43:53 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63
00:07:26.949 11:43:53 accel -- accel/accel.sh@63 -- # waitforlisten 1641565
00:07:26.949 11:43:53 accel -- common/autotest_common.sh@827 -- # '[' -z 1641565 ']'
00:07:26.949 11:43:53 accel -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:26.949 11:43:53 accel -- common/autotest_common.sh@832 -- # local max_retries=100
00:07:26.949 11:43:53 accel -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:07:26.949 11:43:53 accel -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:26.949 11:43:53 accel -- common/autotest_common.sh@10 -- # set +x
00:07:26.949 11:43:53 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:26.949 11:43:53 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:26.949 11:43:53 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:26.949 11:43:53 accel -- accel/accel.sh@41 -- # jq -r .
00:07:26.949 [2024-05-14 11:43:53.794205] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
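The get_expected_opcs step that follows reads "opcode=module" assignments (as emitted by the accel_get_opc_assignments RPC output after jq formatting) into an associative array, splitting each line on `=`. A small sketch of that bookkeeping; the opcode names and module values below are sample data, not actual RPC output:

```shell
#!/usr/bin/env bash
# Illustrative sketch of the expected_opcs loop (accel.sh@71-73): each
# "opc=module" line is split with IFS='=' and stored per opcode. The
# function name and sample assignments here are assumptions.
load_opc_assignments() {
  declare -gA expected_opcs=()   # global associative array, as in accel.sh
  local opc module
  while IFS='=' read -r opc module; do
    expected_opcs["$opc"]=$module
  done
}

load_opc_assignments <<'EOF'
copy=software
compress=dpdk_compressdev
decompress=dpdk_compressdev
EOF

echo "${expected_opcs[compress]}"
```

In this run most opcodes stay on the software module, while the compress/decompress opcodes are remapped to dpdk_compressdev once compressdev_scan_accel_module has registered the QAT PMD.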
00:07:26.949 [2024-05-14 11:43:53.794269] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1641565 ]
00:07:26.949 [2024-05-14 11:43:53.921105] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:26.949 [2024-05-14 11:43:54.018998] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:27.884 [2024-05-14 11:43:54.777505] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD
00:07:28.143 11:43:54 accel -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:28.143 11:43:54 accel -- common/autotest_common.sh@860 -- # return 0
00:07:28.143 11:43:54 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:28.143 11:43:54 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:28.143 11:43:54 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:28.143 11:43:54 accel -- accel/accel.sh@68 -- # [[ -n 1 ]]
00:07:28.143 11:43:54 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module
00:07:28.143 11:43:54 accel -- accel/accel.sh@56 -- # rpc_cmd save_config
00:07:28.143 11:43:54 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]'
00:07:28.143 11:43:54 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module
00:07:28.143 "method": "compressdev_scan_accel_module",
00:07:28.143 11:43:55 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]"))
00:07:28.143 11:43:55 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments
[... accel.sh@71-73 expected_opcs loop xtrace trimmed: each opcode is read with IFS== / read -r opc module and mapped to software, except the two compress/decompress opcodes, which map to dpdk_compressdev ...]
00:07:28.143 11:43:55 accel -- accel/accel.sh@75 -- # killprocess 1641565
00:07:28.143 11:43:55 accel -- common/autotest_common.sh@946 -- # '[' -z 1641565 ']'
00:07:28.143 11:43:55 accel -- common/autotest_common.sh@950 -- # kill -0 1641565
00:07:28.143 11:43:55 accel -- common/autotest_common.sh@951 -- # uname
00:07:28.143 11:43:55 accel -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']'
00:07:28.143 11:43:55 accel -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1641565
00:07:28.401 11:43:55 accel -- common/autotest_common.sh@952 -- # process_name=reactor_0
00:07:28.401 11:43:55 accel -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']'
00:07:28.401 11:43:55 accel -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1641565'
killing process with pid 1641565
00:07:28.401 11:43:55 accel -- common/autotest_common.sh@965 -- # kill 1641565
00:07:28.401 11:43:55 accel -- common/autotest_common.sh@970 -- # wait 1641565
00:07:28.659 11:43:55 accel -- accel/accel.sh@76 -- # trap - ERR
00:07:28.659 11:43:55 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:28.659 11:43:55 accel -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:07:28.659 11:43:55 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:28.659 11:43:55 accel -- common/autotest_common.sh@10 -- # set +x 00:07:28.659 ************************************ 00:07:28.659 START TEST accel_cdev_comp 00:07:28.659 ************************************ 00:07:28.659 11:43:55 accel.accel_cdev_comp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": 
"compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:28.659 11:43:55 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:07:28.659 [2024-05-14 11:43:55.703216] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:28.659 [2024-05-14 11:43:55.703274] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1641792 ] 00:07:28.916 [2024-05-14 11:43:55.831718] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.916 [2024-05-14 11:43:55.932338] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.848 [2024-05-14 11:43:56.692589] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:29.848 [2024-05-14 11:43:56.695166] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x216a0c0 PMD being used: compress_qat 00:07:29.848 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:29.848 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.848 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.848 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.848 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:29.848 [2024-05-14 11:43:56.699144] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x216ee50 PMD being used: compress_qat 00:07:29.848 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.848 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.848 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.848 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:29.849 11:43:56 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 
00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp 
-- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:29.849 11:43:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:31.222 
11:43:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:31.222 11:43:57 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:31.222 00:07:31.222 real 0m2.212s 00:07:31.222 user 0m1.635s 00:07:31.222 sys 0m0.577s 00:07:31.222 11:43:57 accel.accel_cdev_comp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:31.222 11:43:57 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:07:31.222 ************************************ 00:07:31.222 END TEST accel_cdev_comp 00:07:31.222 
************************************ 00:07:31.222 11:43:57 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:31.222 11:43:57 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:31.222 11:43:57 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:31.222 11:43:57 accel -- common/autotest_common.sh@10 -- # set +x 00:07:31.222 ************************************ 00:07:31.222 START TEST accel_cdev_decomp 00:07:31.222 ************************************ 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.222 
11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:31.222 11:43:57 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:31.222 [2024-05-14 11:43:58.002142] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:31.222 [2024-05-14 11:43:58.002201] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1642162 ] 00:07:31.222 [2024-05-14 11:43:58.129756] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.223 [2024-05-14 11:43:58.228774] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.155 [2024-05-14 11:43:59.003826] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:32.155 [2024-05-14 11:43:59.006462] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17a10c0 PMD being used: compress_qat 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:32.155 [2024-05-14 11:43:59.010626] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17a5e70 PMD being used: compress_qat 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 
00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 
00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:32.155 11:43:59 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 
00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:33.546 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:33.547 00:07:33.547 
real 0m2.221s 00:07:33.547 user 0m1.645s 00:07:33.547 sys 0m0.579s 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:33.547 11:44:00 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:33.547 ************************************ 00:07:33.547 END TEST accel_cdev_decomp 00:07:33.547 ************************************ 00:07:33.547 11:44:00 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decmop_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:33.547 11:44:00 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:33.547 11:44:00 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:33.547 11:44:00 accel -- common/autotest_common.sh@10 -- # set +x 00:07:33.547 ************************************ 00:07:33.547 START TEST accel_cdev_decmop_full 00:07:33.547 ************************************ 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@16 -- # local accel_opc 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@17 -- # local accel_module 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:33.547 11:44:00 
accel.accel_cdev_decmop_full -- accel/accel.sh@12 -- # build_accel_config 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@40 -- # local IFS=, 00:07:33.547 11:44:00 accel.accel_cdev_decmop_full -- accel/accel.sh@41 -- # jq -r . 00:07:33.547 [2024-05-14 11:44:00.311511] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:33.547 [2024-05-14 11:44:00.311573] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1642391 ] 00:07:33.547 [2024-05-14 11:44:00.441215] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.547 [2024-05-14 11:44:00.541080] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.490 [2024-05-14 11:44:01.303607] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:34.490 [2024-05-14 11:44:01.306225] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x130a0c0 PMD being used: compress_qat 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 
00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:34.490 [2024-05-14 11:44:01.309538] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x130d470 PMD being used: compress_qat 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=0x1 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 
11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=decompress 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 
accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=32 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=32 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=1 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=Yes 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # 
case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:34.490 11:44:01 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- 
accel/accel.sh@20 -- # val=
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=:
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=:
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:07:35.422
00:07:35.422 real 0m2.216s
00:07:35.422 user 0m1.623s
00:07:35.422 sys 0m0.592s
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- common/autotest_common.sh@1122 -- # xtrace_disable
00:07:35.422 11:44:02 accel.accel_cdev_decmop_full -- common/autotest_common.sh@10 -- # set +x
00:07:35.422 ************************************
00:07:35.422 END TEST accel_cdev_decmop_full
00:07:35.422 ************************************
00:07:35.680 11:44:02 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:35.680 11:44:02 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']'
00:07:35.680 11:44:02 accel -- common/autotest_common.sh@1103 -- # xtrace_disable
00:07:35.680 11:44:02 accel -- common/autotest_common.sh@10 -- # set +x
00:07:35.680 ************************************
00:07:35.680 START TEST accel_cdev_decomp_mcore
00:07:35.680 ************************************
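The mcore variants launch `accel_perf` with `-m 0xf`, a hexadecimal core mask in which each set bit selects one core, so `0xf` selects cores 0-3 and one reactor starts per core (matching the four "Reactor started on core N" notices in this run). A small sketch of how such a mask maps to a core count; the `count_cores` helper is illustrative only, not an SPDK function:

```shell
#!/usr/bin/env bash
# Illustrative helper (not part of SPDK): count the cores selected by a
# -m core mask such as 0xf by counting its set bits.
count_cores() {
  local mask=$(( $1 )) n=0     # $(( 0xf )) parses the hex literal -> 15
  while (( mask )); do
    n=$(( n + (mask & 1) ))    # low bit set means that core is selected
    mask=$(( mask >> 1 ))
  done
  echo "$n"
}

count_cores 0xf   # cores 0-3 -> prints 4
count_cores 0x1   # core 0 only -> prints 1
```

This is also why the multicore results below can show more user CPU time than wall-clock time: four reactors burn CPU in parallel for the one-second test window.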
00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:35.680 11:44:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 
00:07:35.680 [2024-05-14 11:44:02.612328] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:35.680 [2024-05-14 11:44:02.612384] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1642735 ] 00:07:35.680 [2024-05-14 11:44:02.739903] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:35.938 [2024-05-14 11:44:02.845689] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:35.938 [2024-05-14 11:44:02.845776] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:35.938 [2024-05-14 11:44:02.845870] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:35.938 [2024-05-14 11:44:02.845872] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.872 [2024-05-14 11:44:03.611239] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:36.872 [2024-05-14 11:44:03.613854] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21436e0 PMD being used: compress_qat 00:07:36.872 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:36.872 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.872 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.872 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.872 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:36.872 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.872 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.872 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.872 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:36.872 
11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 [2024-05-14 11:44:03.619496] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7e2419b890 PMD being used: compress_qat 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 [2024-05-14 11:44:03.620254] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7e1c19b890 PMD being used: compress_qat 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 [2024-05-14 11:44:03.621341] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2148c50 PMD being used: compress_qat 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:36.873 [2024-05-14 11:44:03.621534] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7e1419b890 PMD being used: compress_qat 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:36.873 
11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:36.873 11:44:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.807 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:37.807 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:37.807 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:37.807 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.807 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:37.807 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:37.808 11:44:04 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:07:37.808
00:07:37.808 real 0m2.224s
00:07:37.808 user 0m7.190s
00:07:37.808 sys 0m0.596s
00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable
00:07:37.808 11:44:04 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x
00:07:37.808 ************************************
00:07:37.808 END TEST accel_cdev_decomp_mcore
00:07:37.808 ************************************
00:07:37.808 11:44:04 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:37.808 11:44:04 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:07:37.808 11:44:04 accel -- common/autotest_common.sh@1103 -- # xtrace_disable
00:07:37.808 11:44:04 accel -- common/autotest_common.sh@10 -- # set +x
00:07:37.808 ************************************
00:07:37.808 START TEST accel_cdev_decomp_full_mcore
00:07:37.808 ************************************
00:07:37.808 11:44:04 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:37.808 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc
00:07:37.808 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module
00:07:37.808 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:07:37.808 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:07:37.808 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:38.066 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- #
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:38.066 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:38.066 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:38.066 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:38.067 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:38.067 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:38.067 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:38.067 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:38.067 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:38.067 11:44:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:38.067 [2024-05-14 11:44:04.922859] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:38.067 [2024-05-14 11:44:04.922916] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1643105 ] 00:07:38.067 [2024-05-14 11:44:05.050451] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:38.325 [2024-05-14 11:44:05.154500] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.325 [2024-05-14 11:44:05.154525] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:38.325 [2024-05-14 11:44:05.154616] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:38.325 [2024-05-14 11:44:05.154619] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.891 [2024-05-14 11:44:05.918903] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:38.891 [2024-05-14 11:44:05.921418] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xd556e0 PMD being used: compress_qat 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.891 [2024-05-14 11:44:05.926049] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9bc419b890 PMD being used: compress_qat 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:38.891 [2024-05-14 11:44:05.926783] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9bbc19b890 PMD being used: compress_qat 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.891 [2024-05-14 11:44:05.927795] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xd55780 PMD being used: compress_qat 00:07:38.891 [2024-05-14 11:44:05.927951] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9bb419b890 PMD being used: compress_qat 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 
00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:38.891 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 
00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:38.892 11:44:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.266 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:40.267 11:44:07 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:40.267 11:44:07 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:40.267 00:07:40.267 real 0m2.239s 00:07:40.267 user 0m7.236s 00:07:40.267 sys 0m0.598s 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:40.267 11:44:07 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:40.267 ************************************ 00:07:40.267 END TEST accel_cdev_decomp_full_mcore 00:07:40.267 ************************************ 00:07:40.267 11:44:07 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:40.267 11:44:07 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:40.267 11:44:07 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:40.267 11:44:07 accel -- common/autotest_common.sh@10 -- # set +x 00:07:40.267 ************************************ 00:07:40.267 START TEST accel_cdev_decomp_mthread 00:07:40.267 ************************************ 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:40.267 11:44:07 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:40.267 11:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:40.267 [2024-05-14 11:44:07.237293] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:40.267 [2024-05-14 11:44:07.237350] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1643410 ] 00:07:40.525 [2024-05-14 11:44:07.367038] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.525 [2024-05-14 11:44:07.470725] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.468 [2024-05-14 11:44:08.242583] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:41.468 [2024-05-14 11:44:08.245152] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e0f0c0 PMD being used: compress_qat 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:41.468 [2024-05-14 11:44:08.250165] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e14220 PMD being used: compress_qat 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 
11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:41.468 [2024-05-14 11:44:08.252751] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f37010 PMD being used: compress_qat 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 
11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:41.468 11:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 
00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:42.402 00:07:42.402 real 0m2.235s 00:07:42.402 user 0m1.659s 00:07:42.402 sys 0m0.580s 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:42.402 11:44:09 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:42.402 ************************************ 00:07:42.402 END TEST accel_cdev_decomp_mthread 00:07:42.402 ************************************ 00:07:42.402 11:44:09 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:42.402 11:44:09 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:42.402 11:44:09 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:42.402 11:44:09 accel -- common/autotest_common.sh@10 -- # set +x 00:07:42.660 ************************************ 00:07:42.660 START TEST accel_cdev_decomp_full_mthread 00:07:42.660 ************************************ 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.660 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:42.661 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:42.661 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:42.661 11:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:07:42.661 [2024-05-14 11:44:09.558114] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:42.661 [2024-05-14 11:44:09.558172] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1643675 ] 00:07:42.661 [2024-05-14 11:44:09.687930] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.919 [2024-05-14 11:44:09.793920] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.485 [2024-05-14 11:44:10.565931] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:43.485 [2024-05-14 11:44:10.568574] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x8fa0c0 PMD being used: compress_qat 00:07:43.485 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:43.485 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.486 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.486 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:43.744 [2024-05-14 11:44:10.572741] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x8fd470 PMD being used: compress_qat 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:43.744 [2024-05-14 11:44:10.575655] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa21c40 PMD being used: compress_qat 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:43.744 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:43.745 11:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.680 11:44:11 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.680 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:44.681 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.681 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.681 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.681 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:44.681 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:44.681 11:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:44.681 00:07:44.681 real 0m2.239s 00:07:44.681 user 0m1.646s 00:07:44.681 sys 0m0.596s 00:07:44.681 11:44:11 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:44.681 11:44:11 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:44.681 ************************************ 00:07:44.681 END TEST accel_cdev_decomp_full_mthread 00:07:44.681 ************************************ 00:07:44.940 11:44:11 accel -- accel/accel.sh@134 -- 
# unset COMPRESSDEV 00:07:44.940 11:44:11 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:44.940 11:44:11 accel -- accel/accel.sh@137 -- # build_accel_config 00:07:44.940 11:44:11 accel -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:07:44.940 11:44:11 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.940 11:44:11 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:44.940 11:44:11 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.940 11:44:11 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.940 11:44:11 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.940 11:44:11 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.940 11:44:11 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.940 11:44:11 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:44.940 11:44:11 accel -- accel/accel.sh@41 -- # jq -r . 00:07:44.940 ************************************ 00:07:44.940 START TEST accel_dif_functional_tests 00:07:44.940 ************************************ 00:07:44.940 11:44:11 accel.accel_dif_functional_tests -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:44.940 [2024-05-14 11:44:11.913855] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:44.940 [2024-05-14 11:44:11.913917] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1644040 ] 00:07:45.197 [2024-05-14 11:44:12.041325] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:45.197 [2024-05-14 11:44:12.145157] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:45.197 [2024-05-14 11:44:12.145243] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:45.197 [2024-05-14 11:44:12.145248] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.197 00:07:45.197 00:07:45.197 CUnit - A unit testing framework for C - Version 2.1-3 00:07:45.197 http://cunit.sourceforge.net/ 00:07:45.197 00:07:45.197 00:07:45.197 Suite: accel_dif 00:07:45.197 Test: verify: DIF generated, GUARD check ...passed 00:07:45.197 Test: verify: DIF generated, APPTAG check ...passed 00:07:45.197 Test: verify: DIF generated, REFTAG check ...passed 00:07:45.197 Test: verify: DIF not generated, GUARD check ...[2024-05-14 11:44:12.239798] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:45.197 [2024-05-14 11:44:12.239855] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:45.197 passed 00:07:45.197 Test: verify: DIF not generated, APPTAG check ...[2024-05-14 11:44:12.239895] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:45.197 [2024-05-14 11:44:12.239919] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:45.197 passed 00:07:45.197 Test: verify: DIF not generated, REFTAG check ...[2024-05-14 11:44:12.239946] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:45.197 [2024-05-14 11:44:12.239976] dif.c: 
776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:45.197 passed 00:07:45.197 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:45.198 Test: verify: APPTAG incorrect, APPTAG check ...[2024-05-14 11:44:12.240039] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:45.198 passed 00:07:45.198 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:45.198 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:45.198 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:45.198 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-05-14 11:44:12.240187] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:45.198 passed 00:07:45.198 Test: generate copy: DIF generated, GUARD check ...passed 00:07:45.198 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:45.198 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:45.198 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:45.198 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:45.198 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:45.198 Test: generate copy: iovecs-len validate ...[2024-05-14 11:44:12.240434] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:45.198 passed 00:07:45.198 Test: generate copy: buffer alignment validate ...passed 00:07:45.198 00:07:45.198 Run Summary: Type Total Ran Passed Failed Inactive 00:07:45.198 suites 1 1 n/a 0 0 00:07:45.198 tests 20 20 20 0 0 00:07:45.198 asserts 204 204 204 0 n/a 00:07:45.198 00:07:45.198 Elapsed time = 0.002 seconds 00:07:45.455 00:07:45.455 real 0m0.601s 00:07:45.455 user 0m0.769s 00:07:45.455 sys 0m0.230s 00:07:45.455 11:44:12 accel.accel_dif_functional_tests -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:45.455 11:44:12 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:07:45.455 ************************************ 00:07:45.455 END TEST accel_dif_functional_tests 00:07:45.455 ************************************ 00:07:45.455 00:07:45.455 real 0m53.919s 00:07:45.455 user 1m2.007s 00:07:45.455 sys 0m11.932s 00:07:45.455 11:44:12 accel -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:45.455 11:44:12 accel -- common/autotest_common.sh@10 -- # set +x 00:07:45.455 ************************************ 00:07:45.455 END TEST accel 00:07:45.455 ************************************ 00:07:45.713 11:44:12 -- spdk/autotest.sh@180 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:45.713 11:44:12 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:45.713 11:44:12 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:45.713 11:44:12 -- common/autotest_common.sh@10 -- # set +x 00:07:45.713 ************************************ 00:07:45.713 START TEST accel_rpc 00:07:45.713 ************************************ 00:07:45.713 11:44:12 accel_rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:45.713 * Looking for test storage... 
00:07:45.713 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:45.713 11:44:12 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:45.713 11:44:12 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1644243 00:07:45.713 11:44:12 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1644243 00:07:45.713 11:44:12 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:45.713 11:44:12 accel_rpc -- common/autotest_common.sh@827 -- # '[' -z 1644243 ']' 00:07:45.713 11:44:12 accel_rpc -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:45.713 11:44:12 accel_rpc -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:45.713 11:44:12 accel_rpc -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:45.713 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:45.713 11:44:12 accel_rpc -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:45.713 11:44:12 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:45.713 [2024-05-14 11:44:12.762212] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:45.713 [2024-05-14 11:44:12.762285] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1644243 ] 00:07:45.971 [2024-05-14 11:44:12.889256] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.971 [2024-05-14 11:44:12.995634] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.905 11:44:13 accel_rpc -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:46.905 11:44:13 accel_rpc -- common/autotest_common.sh@860 -- # return 0 00:07:46.905 11:44:13 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:46.905 11:44:13 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:46.905 11:44:13 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:46.905 11:44:13 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:46.905 11:44:13 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:46.905 11:44:13 accel_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:46.905 11:44:13 accel_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:46.905 11:44:13 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:46.905 ************************************ 00:07:46.905 START TEST accel_assign_opcode 00:07:46.905 ************************************ 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1121 -- # accel_assign_opcode_test_suite 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:46.905 [2024-05-14 11:44:13.717935] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation copy will be assigned to module incorrect 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:46.905 [2024-05-14 11:44:13.725968] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:46.905 11:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.163 software 00:07:47.163 00:07:47.163 real 0m0.300s 00:07:47.163 user 0m0.049s 00:07:47.163 sys 0m0.012s 00:07:47.163 11:44:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:47.163 
11:44:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:47.163 ************************************ 00:07:47.163 END TEST accel_assign_opcode 00:07:47.163 ************************************ 00:07:47.163 11:44:14 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1644243 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@946 -- # '[' -z 1644243 ']' 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@950 -- # kill -0 1644243 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@951 -- # uname 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1644243 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1644243' 00:07:47.163 killing process with pid 1644243 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@965 -- # kill 1644243 00:07:47.163 11:44:14 accel_rpc -- common/autotest_common.sh@970 -- # wait 1644243 00:07:47.421 00:07:47.421 real 0m1.897s 00:07:47.421 user 0m1.941s 00:07:47.421 sys 0m0.595s 00:07:47.421 11:44:14 accel_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:47.421 11:44:14 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:47.421 ************************************ 00:07:47.421 END TEST accel_rpc 00:07:47.421 ************************************ 00:07:47.679 11:44:14 -- spdk/autotest.sh@181 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:47.679 11:44:14 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:47.679 11:44:14 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:47.679 11:44:14 -- common/autotest_common.sh@10 -- 
# set +x 00:07:47.679 ************************************ 00:07:47.679 START TEST app_cmdline 00:07:47.679 ************************************ 00:07:47.679 11:44:14 app_cmdline -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:47.679 * Looking for test storage... 00:07:47.679 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:47.679 11:44:14 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:47.679 11:44:14 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1644529 00:07:47.679 11:44:14 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1644529 00:07:47.679 11:44:14 app_cmdline -- common/autotest_common.sh@827 -- # '[' -z 1644529 ']' 00:07:47.679 11:44:14 app_cmdline -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:47.679 11:44:14 app_cmdline -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:47.679 11:44:14 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:47.679 11:44:14 app_cmdline -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:47.679 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:47.679 11:44:14 app_cmdline -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:47.679 11:44:14 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:47.679 [2024-05-14 11:44:14.757019] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:47.679 [2024-05-14 11:44:14.757091] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1644529 ] 00:07:47.937 [2024-05-14 11:44:14.885314] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.937 [2024-05-14 11:44:14.989576] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@860 -- # return 0 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:48.939 { 00:07:48.939 "version": "SPDK v24.05-pre git sha1 b68ae4fb9", 00:07:48.939 "fields": { 00:07:48.939 "major": 24, 00:07:48.939 "minor": 5, 00:07:48.939 "patch": 0, 00:07:48.939 "suffix": "-pre", 00:07:48.939 "commit": "b68ae4fb9" 00:07:48.939 } 00:07:48.939 } 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 
2 )) 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:48.939 11:44:15 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:07:48.939 11:44:15 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:49.198 request: 00:07:49.198 { 00:07:49.198 "method": "env_dpdk_get_mem_stats", 00:07:49.198 "req_id": 1 00:07:49.198 } 00:07:49.198 Got JSON-RPC error response 00:07:49.198 response: 00:07:49.198 { 00:07:49.198 "code": -32601, 00:07:49.198 
"message": "Method not found" 00:07:49.198 } 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:49.198 11:44:16 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1644529 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@946 -- # '[' -z 1644529 ']' 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@950 -- # kill -0 1644529 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@951 -- # uname 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1644529 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1644529' 00:07:49.198 killing process with pid 1644529 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@965 -- # kill 1644529 00:07:49.198 11:44:16 app_cmdline -- common/autotest_common.sh@970 -- # wait 1644529 00:07:49.457 00:07:49.457 real 0m1.916s 00:07:49.457 user 0m2.200s 00:07:49.457 sys 0m0.599s 00:07:49.457 11:44:16 app_cmdline -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:49.457 11:44:16 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:49.457 ************************************ 00:07:49.457 END TEST app_cmdline 00:07:49.457 ************************************ 00:07:49.457 11:44:16 -- spdk/autotest.sh@182 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:49.457 11:44:16 -- 
common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:49.457 11:44:16 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:49.457 11:44:16 -- common/autotest_common.sh@10 -- # set +x 00:07:49.715 ************************************ 00:07:49.715 START TEST version 00:07:49.715 ************************************ 00:07:49.715 11:44:16 version -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:49.715 * Looking for test storage... 00:07:49.715 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:49.715 11:44:16 version -- app/version.sh@17 -- # get_header_version major 00:07:49.715 11:44:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:49.715 11:44:16 version -- app/version.sh@14 -- # cut -f2 00:07:49.715 11:44:16 version -- app/version.sh@14 -- # tr -d '"' 00:07:49.715 11:44:16 version -- app/version.sh@17 -- # major=24 00:07:49.715 11:44:16 version -- app/version.sh@18 -- # get_header_version minor 00:07:49.715 11:44:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:49.715 11:44:16 version -- app/version.sh@14 -- # tr -d '"' 00:07:49.715 11:44:16 version -- app/version.sh@14 -- # cut -f2 00:07:49.715 11:44:16 version -- app/version.sh@18 -- # minor=5 00:07:49.715 11:44:16 version -- app/version.sh@19 -- # get_header_version patch 00:07:49.715 11:44:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:49.715 11:44:16 version -- app/version.sh@14 -- # cut -f2 00:07:49.715 11:44:16 version -- app/version.sh@14 -- # tr -d '"' 00:07:49.715 11:44:16 version -- app/version.sh@19 -- # patch=0 00:07:49.715 11:44:16 version -- 
app/version.sh@20 -- # get_header_version suffix 00:07:49.715 11:44:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:49.715 11:44:16 version -- app/version.sh@14 -- # cut -f2 00:07:49.715 11:44:16 version -- app/version.sh@14 -- # tr -d '"' 00:07:49.715 11:44:16 version -- app/version.sh@20 -- # suffix=-pre 00:07:49.715 11:44:16 version -- app/version.sh@22 -- # version=24.5 00:07:49.715 11:44:16 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:49.715 11:44:16 version -- app/version.sh@28 -- # version=24.5rc0 00:07:49.715 11:44:16 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:07:49.715 11:44:16 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:49.715 11:44:16 version -- app/version.sh@30 -- # py_version=24.5rc0 00:07:49.715 11:44:16 version -- app/version.sh@31 -- # [[ 24.5rc0 == \2\4\.\5\r\c\0 ]] 00:07:49.715 00:07:49.715 real 0m0.182s 00:07:49.715 user 0m0.093s 00:07:49.715 sys 0m0.135s 00:07:49.715 11:44:16 version -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:49.715 11:44:16 version -- common/autotest_common.sh@10 -- # set +x 00:07:49.715 ************************************ 00:07:49.715 END TEST version 00:07:49.715 ************************************ 00:07:49.974 11:44:16 -- spdk/autotest.sh@184 -- # '[' 1 -eq 1 ']' 00:07:49.974 11:44:16 -- spdk/autotest.sh@185 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:49.974 11:44:16 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:49.974 11:44:16 -- common/autotest_common.sh@1103 
-- # xtrace_disable 00:07:49.974 11:44:16 -- common/autotest_common.sh@10 -- # set +x 00:07:49.974 ************************************ 00:07:49.974 START TEST blockdev_general 00:07:49.974 ************************************ 00:07:49.974 11:44:16 blockdev_general -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:49.974 * Looking for test storage... 00:07:49.974 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:49.974 11:44:16 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:49.974 11:44:16 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:07:49.974 11:44:16 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:07:49.974 11:44:16 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:49.974 11:44:16 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:07:49.975 11:44:16 blockdev_general -- 
bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1645000 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:07:49.975 11:44:16 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1645000 00:07:49.975 11:44:16 blockdev_general -- common/autotest_common.sh@827 -- # '[' -z 1645000 ']' 00:07:49.975 11:44:16 blockdev_general -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:49.975 11:44:16 blockdev_general -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:49.975 11:44:16 blockdev_general -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:49.975 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:49.975 11:44:16 blockdev_general -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:49.975 11:44:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:49.975 [2024-05-14 11:44:17.038908] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:49.975 [2024-05-14 11:44:17.038980] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1645000 ] 00:07:50.233 [2024-05-14 11:44:17.160153] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.233 [2024-05-14 11:44:17.257463] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.169 11:44:17 blockdev_general -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:51.169 11:44:17 blockdev_general -- common/autotest_common.sh@860 -- # return 0 00:07:51.169 11:44:17 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:07:51.169 11:44:17 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:07:51.169 11:44:17 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:07:51.169 11:44:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:51.169 11:44:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:51.169 [2024-05-14 11:44:18.131837] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:51.169 [2024-05-14 11:44:18.131893] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:51.169 00:07:51.169 [2024-05-14 11:44:18.139817] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:51.169 [2024-05-14 11:44:18.139842] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:51.169 00:07:51.169 Malloc0 00:07:51.169 Malloc1 00:07:51.169 Malloc2 00:07:51.169 Malloc3 
00:07:51.169 Malloc4 00:07:51.169 Malloc5 00:07:51.169 Malloc6 00:07:51.428 Malloc7 00:07:51.428 Malloc8 00:07:51.428 Malloc9 00:07:51.428 [2024-05-14 11:44:18.287011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:51.428 [2024-05-14 11:44:18.287061] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:51.428 [2024-05-14 11:44:18.287079] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1778450 00:07:51.428 [2024-05-14 11:44:18.287092] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:51.428 [2024-05-14 11:44:18.288460] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:51.428 [2024-05-14 11:44:18.288490] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:51.428 TestPT 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:07:51.428 5000+0 records in 00:07:51.428 5000+0 records out 00:07:51.428 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0273858 s, 374 MB/s 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:51.428 AIO0 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 
00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:51.428 11:44:18 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:07:51.428 11:44:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:51.689 11:44:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:51.689 11:44:18 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:07:51.689 11:44:18 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 
00:07:51.690 11:44:18 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "f7239e85-b7ef-49ca-aead-d105333099ed"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f7239e85-b7ef-49ca-aead-d105333099ed",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "611cfa7b-6fda-5561-a335-d47b22a64e49"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "611cfa7b-6fda-5561-a335-d47b22a64e49",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "be6136e7-4455-5283-88e8-47b4a8d09173"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "be6136e7-4455-5283-88e8-47b4a8d09173",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "965d7fe7-2496-51d1-96f7-650761f32f9f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "965d7fe7-2496-51d1-96f7-650761f32f9f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "a01f5184-266b-569e-93f8-d53145bd9bca"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a01f5184-266b-569e-93f8-d53145bd9bca",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' 
}' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "bcc0a70b-aed3-5507-9607-8e703ae1685d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bcc0a70b-aed3-5507-9607-8e703ae1685d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "4b0ea32a-d9a7-5206-985c-61f446a796bb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4b0ea32a-d9a7-5206-985c-61f446a796bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "1b594e1d-b1b7-54c0-bf83-7f0a6856c2e7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1b594e1d-b1b7-54c0-bf83-7f0a6856c2e7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' 
' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "44aee8c5-eec9-56ad-9dc6-b2ceca842e38"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "44aee8c5-eec9-56ad-9dc6-b2ceca842e38",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "aff2c8e0-68cb-5368-acdd-3130eaedaf94"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "aff2c8e0-68cb-5368-acdd-3130eaedaf94",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "3110a40c-a829-52d3-be69-007b7acf9797"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 
8192,' ' "uuid": "3110a40c-a829-52d3-be69-007b7acf9797",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "2e095030-0f4a-592b-841d-2233c2f4cc2f"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2e095030-0f4a-592b-841d-2233c2f4cc2f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "c0a34f8a-3f24-43ca-ae72-98075203d2a6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "c0a34f8a-3f24-43ca-ae72-98075203d2a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' 
"unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c0a34f8a-3f24-43ca-ae72-98075203d2a6",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "28759b12-dccd-402e-bcec-6a9cff0bf5be",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "e078e1e8-e42f-4de0-827b-ec2be66dc651",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "a7bb643b-c1a6-4697-98ab-8738de6ebe44"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a7bb643b-c1a6-4697-98ab-8738de6ebe44",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": 
"system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a7bb643b-c1a6-4697-98ab-8738de6ebe44",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "7f2c24c5-109d-4caf-bfb1-af251ab59985",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "305f0226-469e-4f8f-95e8-4a91f6f1c175",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "772e922b-3284-4c2f-bce9-1cd597d49835"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "772e922b-3284-4c2f-bce9-1cd597d49835",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "772e922b-3284-4c2f-bce9-1cd597d49835",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' 
"num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "df7e6225-4923-4189-bc25-8fcff002d358",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "d8b78ed1-32fd-4bc5-a921-80fe8bccd7e3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "c2e88c3f-e6b4-4576-88c9-38cb1a20109e"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "c2e88c3f-e6b4-4576-88c9-38cb1a20109e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:51.690 11:44:18 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:07:51.690 11:44:18 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:07:51.690 11:44:18 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:07:51.690 11:44:18 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 1645000 00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@946 -- # '[' -z 1645000 ']' 00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@950 -- # kill -0 1645000 00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@951 -- # uname 00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 
00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1645000 00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1645000' 00:07:51.690 killing process with pid 1645000 00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@965 -- # kill 1645000 00:07:51.690 11:44:18 blockdev_general -- common/autotest_common.sh@970 -- # wait 1645000 00:07:52.258 11:44:19 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:52.258 11:44:19 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:52.258 11:44:19 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:52.258 11:44:19 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:52.258 11:44:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:52.258 ************************************ 00:07:52.258 START TEST bdev_hello_world 00:07:52.258 ************************************ 00:07:52.258 11:44:19 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:52.517 [2024-05-14 11:44:19.362789] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:52.517 [2024-05-14 11:44:19.362834] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1645299 ] 00:07:52.517 [2024-05-14 11:44:19.474218] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.517 [2024-05-14 11:44:19.573812] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.776 [2024-05-14 11:44:19.737858] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:52.776 [2024-05-14 11:44:19.737933] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:52.776 [2024-05-14 11:44:19.737948] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:52.776 [2024-05-14 11:44:19.745859] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:52.776 [2024-05-14 11:44:19.745885] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:52.776 [2024-05-14 11:44:19.753874] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:52.776 [2024-05-14 11:44:19.753898] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:52.776 [2024-05-14 11:44:19.830926] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:52.776 [2024-05-14 11:44:19.830982] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:52.776 [2024-05-14 11:44:19.830999] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcab2f0 00:07:52.776 [2024-05-14 11:44:19.831013] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:52.776 [2024-05-14 11:44:19.832505] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:07:52.776 [2024-05-14 11:44:19.832534] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:53.034 [2024-05-14 11:44:19.975859] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:07:53.034 [2024-05-14 11:44:19.975930] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:07:53.034 [2024-05-14 11:44:19.975985] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:07:53.034 [2024-05-14 11:44:19.976061] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:07:53.034 [2024-05-14 11:44:19.976137] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:07:53.034 [2024-05-14 11:44:19.976167] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:07:53.034 [2024-05-14 11:44:19.976230] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:07:53.034 00:07:53.034 [2024-05-14 11:44:19.976271] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:07:53.293 00:07:53.293 real 0m1.001s 00:07:53.293 user 0m0.655s 00:07:53.293 sys 0m0.311s 00:07:53.293 11:44:20 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:53.293 11:44:20 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:07:53.293 ************************************ 00:07:53.293 END TEST bdev_hello_world 00:07:53.293 ************************************ 00:07:53.293 11:44:20 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:07:53.293 11:44:20 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:07:53.293 11:44:20 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:53.293 11:44:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:53.552 ************************************ 00:07:53.552 START TEST bdev_bounds 00:07:53.552 ************************************ 00:07:53.552 11:44:20 
blockdev_general.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1645406 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1645406' 00:07:53.552 Process bdevio pid: 1645406 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1645406 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 1645406 ']' 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:53.552 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:53.552 11:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:53.552 [2024-05-14 11:44:20.469368] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:07:53.552 [2024-05-14 11:44:20.469444] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1645406 ] 00:07:53.552 [2024-05-14 11:44:20.601018] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:53.810 [2024-05-14 11:44:20.707648] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:53.810 [2024-05-14 11:44:20.707733] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:53.810 [2024-05-14 11:44:20.707737] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.810 [2024-05-14 11:44:20.874188] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:53.810 [2024-05-14 11:44:20.874255] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:53.810 [2024-05-14 11:44:20.874271] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:53.810 [2024-05-14 11:44:20.882201] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:53.810 [2024-05-14 11:44:20.882230] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:53.810 [2024-05-14 11:44:20.890210] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:53.810 [2024-05-14 11:44:20.890236] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:54.068 [2024-05-14 11:44:20.967696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:54.068 [2024-05-14 11:44:20.967753] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:54.068 [2024-05-14 11:44:20.967772] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13fc5b0 00:07:54.068 [2024-05-14 
11:44:20.967784] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:54.068 [2024-05-14 11:44:20.969528] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:54.068 [2024-05-14 11:44:20.969558] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:54.326 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:54.326 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:07:54.326 11:44:21 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:07:54.586 I/O targets: 00:07:54.586 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:07:54.586 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:07:54.586 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:07:54.586 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:07:54.586 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:07:54.586 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:07:54.586 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:07:54.586 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:07:54.586 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:07:54.586 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:07:54.586 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:07:54.586 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:07:54.586 raid0: 131072 blocks of 512 bytes (64 MiB) 00:07:54.586 concat0: 131072 blocks of 512 bytes (64 MiB) 00:07:54.586 raid1: 65536 blocks of 512 bytes (32 MiB) 00:07:54.586 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:07:54.586 00:07:54.586 00:07:54.586 CUnit - A unit testing framework for C - Version 2.1-3 00:07:54.586 http://cunit.sourceforge.net/ 00:07:54.586 00:07:54.586 00:07:54.586 Suite: bdevio tests on: AIO0 00:07:54.586 Test: blockdev write read block ...passed 00:07:54.586 Test: blockdev write zeroes read block ...passed 00:07:54.586 Test: blockdev write zeroes 
read no split ...passed 00:07:54.586 Test: blockdev write zeroes read split ...passed 00:07:54.586 Test: blockdev write zeroes read split partial ...passed 00:07:54.586 Test: blockdev reset ...passed 00:07:54.586 Test: blockdev write read 8 blocks ...passed 00:07:54.586 Test: blockdev write read size > 128k ...passed 00:07:54.586 Test: blockdev write read invalid size ...passed 00:07:54.586 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.586 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.586 Test: blockdev write read max offset ...passed 00:07:54.586 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.586 Test: blockdev writev readv 8 blocks ...passed 00:07:54.586 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.586 Test: blockdev writev readv block ...passed 00:07:54.586 Test: blockdev writev readv size > 128k ...passed 00:07:54.586 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.586 Test: blockdev comparev and writev ...passed 00:07:54.586 Test: blockdev nvme passthru rw ...passed 00:07:54.586 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.586 Test: blockdev nvme admin passthru ...passed 00:07:54.586 Test: blockdev copy ...passed 00:07:54.586 Suite: bdevio tests on: raid1 00:07:54.586 Test: blockdev write read block ...passed 00:07:54.586 Test: blockdev write zeroes read block ...passed 00:07:54.586 Test: blockdev write zeroes read no split ...passed 00:07:54.586 Test: blockdev write zeroes read split ...passed 00:07:54.586 Test: blockdev write zeroes read split partial ...passed 00:07:54.586 Test: blockdev reset ...passed 00:07:54.586 Test: blockdev write read 8 blocks ...passed 00:07:54.586 Test: blockdev write read size > 128k ...passed 00:07:54.586 Test: blockdev write read invalid size ...passed 00:07:54.586 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.586 Test: blockdev write 
read offset + nbytes > size of blockdev ...passed 00:07:54.586 Test: blockdev write read max offset ...passed 00:07:54.586 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.586 Test: blockdev writev readv 8 blocks ...passed 00:07:54.586 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.586 Test: blockdev writev readv block ...passed 00:07:54.586 Test: blockdev writev readv size > 128k ...passed 00:07:54.586 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.586 Test: blockdev comparev and writev ...passed 00:07:54.586 Test: blockdev nvme passthru rw ...passed 00:07:54.586 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.586 Test: blockdev nvme admin passthru ...passed 00:07:54.586 Test: blockdev copy ...passed 00:07:54.586 Suite: bdevio tests on: concat0 00:07:54.586 Test: blockdev write read block ...passed 00:07:54.586 Test: blockdev write zeroes read block ...passed 00:07:54.586 Test: blockdev write zeroes read no split ...passed 00:07:54.586 Test: blockdev write zeroes read split ...passed 00:07:54.586 Test: blockdev write zeroes read split partial ...passed 00:07:54.586 Test: blockdev reset ...passed 00:07:54.586 Test: blockdev write read 8 blocks ...passed 00:07:54.586 Test: blockdev write read size > 128k ...passed 00:07:54.586 Test: blockdev write read invalid size ...passed 00:07:54.586 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.586 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.586 Test: blockdev write read max offset ...passed 00:07:54.586 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.586 Test: blockdev writev readv 8 blocks ...passed 00:07:54.586 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.586 Test: blockdev writev readv block ...passed 00:07:54.586 Test: blockdev writev readv size > 128k ...passed 00:07:54.586 Test: blockdev writev readv size > 128k in two 
iovs ...passed 00:07:54.586 Test: blockdev comparev and writev ...passed 00:07:54.586 Test: blockdev nvme passthru rw ...passed 00:07:54.586 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.586 Test: blockdev nvme admin passthru ...passed 00:07:54.586 Test: blockdev copy ...passed 00:07:54.586 Suite: bdevio tests on: raid0 00:07:54.586 Test: blockdev write read block ...passed 00:07:54.586 Test: blockdev write zeroes read block ...passed 00:07:54.586 Test: blockdev write zeroes read no split ...passed 00:07:54.586 Test: blockdev write zeroes read split ...passed 00:07:54.586 Test: blockdev write zeroes read split partial ...passed 00:07:54.586 Test: blockdev reset ...passed 00:07:54.586 Test: blockdev write read 8 blocks ...passed 00:07:54.586 Test: blockdev write read size > 128k ...passed 00:07:54.586 Test: blockdev write read invalid size ...passed 00:07:54.586 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.586 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.586 Test: blockdev write read max offset ...passed 00:07:54.586 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.586 Test: blockdev writev readv 8 blocks ...passed 00:07:54.586 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.586 Test: blockdev writev readv block ...passed 00:07:54.586 Test: blockdev writev readv size > 128k ...passed 00:07:54.586 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.586 Test: blockdev comparev and writev ...passed 00:07:54.586 Test: blockdev nvme passthru rw ...passed 00:07:54.586 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.586 Test: blockdev nvme admin passthru ...passed 00:07:54.586 Test: blockdev copy ...passed 00:07:54.586 Suite: bdevio tests on: TestPT 00:07:54.586 Test: blockdev write read block ...passed 00:07:54.586 Test: blockdev write zeroes read block ...passed 00:07:54.586 Test: blockdev write zeroes 
read no split ...passed 00:07:54.586 Test: blockdev write zeroes read split ...passed 00:07:54.586 Test: blockdev write zeroes read split partial ...passed 00:07:54.586 Test: blockdev reset ...passed 00:07:54.586 Test: blockdev write read 8 blocks ...passed 00:07:54.586 Test: blockdev write read size > 128k ...passed 00:07:54.586 Test: blockdev write read invalid size ...passed 00:07:54.586 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.586 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.586 Test: blockdev write read max offset ...passed 00:07:54.586 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.586 Test: blockdev writev readv 8 blocks ...passed 00:07:54.586 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.586 Test: blockdev writev readv block ...passed 00:07:54.586 Test: blockdev writev readv size > 128k ...passed 00:07:54.586 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.586 Test: blockdev comparev and writev ...passed 00:07:54.586 Test: blockdev nvme passthru rw ...passed 00:07:54.586 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.586 Test: blockdev nvme admin passthru ...passed 00:07:54.586 Test: blockdev copy ...passed 00:07:54.586 Suite: bdevio tests on: Malloc2p7 00:07:54.586 Test: blockdev write read block ...passed 00:07:54.586 Test: blockdev write zeroes read block ...passed 00:07:54.586 Test: blockdev write zeroes read no split ...passed 00:07:54.586 Test: blockdev write zeroes read split ...passed 00:07:54.586 Test: blockdev write zeroes read split partial ...passed 00:07:54.586 Test: blockdev reset ...passed 00:07:54.586 Test: blockdev write read 8 blocks ...passed 00:07:54.586 Test: blockdev write read size > 128k ...passed 00:07:54.586 Test: blockdev write read invalid size ...passed 00:07:54.586 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.586 Test: blockdev 
write read offset + nbytes > size of blockdev ...passed 00:07:54.586 Test: blockdev write read max offset ...passed 00:07:54.586 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.586 Test: blockdev writev readv 8 blocks ...passed 00:07:54.587 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.587 Test: blockdev writev readv block ...passed 00:07:54.587 Test: blockdev writev readv size > 128k ...passed 00:07:54.587 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.587 Test: blockdev comparev and writev ...passed 00:07:54.587 Test: blockdev nvme passthru rw ...passed 00:07:54.587 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.587 Test: blockdev nvme admin passthru ...passed 00:07:54.587 Test: blockdev copy ...passed 00:07:54.587 Suite: bdevio tests on: Malloc2p6 00:07:54.587 Test: blockdev write read block ...passed 00:07:54.587 Test: blockdev write zeroes read block ...passed 00:07:54.587 Test: blockdev write zeroes read no split ...passed 00:07:54.587 Test: blockdev write zeroes read split ...passed 00:07:54.587 Test: blockdev write zeroes read split partial ...passed 00:07:54.587 Test: blockdev reset ...passed 00:07:54.587 Test: blockdev write read 8 blocks ...passed 00:07:54.587 Test: blockdev write read size > 128k ...passed 00:07:54.587 Test: blockdev write read invalid size ...passed 00:07:54.587 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.587 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.587 Test: blockdev write read max offset ...passed 00:07:54.587 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.587 Test: blockdev writev readv 8 blocks ...passed 00:07:54.587 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.587 Test: blockdev writev readv block ...passed 00:07:54.587 Test: blockdev writev readv size > 128k ...passed 00:07:54.587 Test: blockdev writev readv size > 
128k in two iovs ...passed 00:07:54.587 Test: blockdev comparev and writev ...passed 00:07:54.587 Test: blockdev nvme passthru rw ...passed 00:07:54.587 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.587 Test: blockdev nvme admin passthru ...passed 00:07:54.587 Test: blockdev copy ...passed 00:07:54.587 Suite: bdevio tests on: Malloc2p5 00:07:54.587 Test: blockdev write read block ...passed 00:07:54.587 Test: blockdev write zeroes read block ...passed 00:07:54.587 Test: blockdev write zeroes read no split ...passed 00:07:54.587 Test: blockdev write zeroes read split ...passed 00:07:54.587 Test: blockdev write zeroes read split partial ...passed 00:07:54.587 Test: blockdev reset ...passed 00:07:54.587 Test: blockdev write read 8 blocks ...passed 00:07:54.587 Test: blockdev write read size > 128k ...passed 00:07:54.587 Test: blockdev write read invalid size ...passed 00:07:54.587 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.587 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.587 Test: blockdev write read max offset ...passed 00:07:54.587 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.587 Test: blockdev writev readv 8 blocks ...passed 00:07:54.587 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.587 Test: blockdev writev readv block ...passed 00:07:54.587 Test: blockdev writev readv size > 128k ...passed 00:07:54.587 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.587 Test: blockdev comparev and writev ...passed 00:07:54.587 Test: blockdev nvme passthru rw ...passed 00:07:54.587 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.587 Test: blockdev nvme admin passthru ...passed 00:07:54.587 Test: blockdev copy ...passed 00:07:54.587 Suite: bdevio tests on: Malloc2p4 00:07:54.587 Test: blockdev write read block ...passed 00:07:54.587 Test: blockdev write zeroes read block ...passed 00:07:54.587 Test: 
blockdev write zeroes read no split ...passed 00:07:54.587 Test: blockdev write zeroes read split ...passed 00:07:54.587 Test: blockdev write zeroes read split partial ...passed 00:07:54.587 Test: blockdev reset ...passed 00:07:54.587 Test: blockdev write read 8 blocks ...passed 00:07:54.587 Test: blockdev write read size > 128k ...passed 00:07:54.587 Test: blockdev write read invalid size ...passed 00:07:54.587 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.587 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.587 Test: blockdev write read max offset ...passed 00:07:54.587 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.587 Test: blockdev writev readv 8 blocks ...passed 00:07:54.587 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.587 Test: blockdev writev readv block ...passed 00:07:54.587 Test: blockdev writev readv size > 128k ...passed 00:07:54.587 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.587 Test: blockdev comparev and writev ...passed 00:07:54.587 Test: blockdev nvme passthru rw ...passed 00:07:54.587 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.587 Test: blockdev nvme admin passthru ...passed 00:07:54.587 Test: blockdev copy ...passed 00:07:54.587 Suite: bdevio tests on: Malloc2p3 00:07:54.587 Test: blockdev write read block ...passed 00:07:54.587 Test: blockdev write zeroes read block ...passed 00:07:54.587 Test: blockdev write zeroes read no split ...passed 00:07:54.846 Test: blockdev write zeroes read split ...passed 00:07:54.847 Test: blockdev write zeroes read split partial ...passed 00:07:54.847 Test: blockdev reset ...passed 00:07:54.847 Test: blockdev write read 8 blocks ...passed 00:07:54.847 Test: blockdev write read size > 128k ...passed 00:07:54.847 Test: blockdev write read invalid size ...passed 00:07:54.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:07:54.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.847 Test: blockdev write read max offset ...passed 00:07:54.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.847 Test: blockdev writev readv 8 blocks ...passed 00:07:54.847 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.847 Test: blockdev writev readv block ...passed 00:07:54.847 Test: blockdev writev readv size > 128k ...passed 00:07:54.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.847 Test: blockdev comparev and writev ...passed 00:07:54.847 Test: blockdev nvme passthru rw ...passed 00:07:54.847 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.847 Test: blockdev nvme admin passthru ...passed 00:07:54.847 Test: blockdev copy ...passed 00:07:54.847 Suite: bdevio tests on: Malloc2p2 00:07:54.847 Test: blockdev write read block ...passed 00:07:54.847 Test: blockdev write zeroes read block ...passed 00:07:54.847 Test: blockdev write zeroes read no split ...passed 00:07:54.847 Test: blockdev write zeroes read split ...passed 00:07:54.847 Test: blockdev write zeroes read split partial ...passed 00:07:54.847 Test: blockdev reset ...passed 00:07:54.847 Test: blockdev write read 8 blocks ...passed 00:07:54.847 Test: blockdev write read size > 128k ...passed 00:07:54.847 Test: blockdev write read invalid size ...passed 00:07:54.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.847 Test: blockdev write read max offset ...passed 00:07:54.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.847 Test: blockdev writev readv 8 blocks ...passed 00:07:54.847 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.847 Test: blockdev writev readv block ...passed 00:07:54.847 Test: blockdev writev readv size > 128k ...passed 00:07:54.847 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:07:54.847 Test: blockdev comparev and writev ...passed 00:07:54.847 Test: blockdev nvme passthru rw ...passed 00:07:54.847 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.847 Test: blockdev nvme admin passthru ...passed 00:07:54.847 Test: blockdev copy ...passed 00:07:54.847 Suite: bdevio tests on: Malloc2p1 00:07:54.847 Test: blockdev write read block ...passed 00:07:54.847 Test: blockdev write zeroes read block ...passed 00:07:54.847 Test: blockdev write zeroes read no split ...passed 00:07:54.847 Test: blockdev write zeroes read split ...passed 00:07:54.847 Test: blockdev write zeroes read split partial ...passed 00:07:54.847 Test: blockdev reset ...passed 00:07:54.847 Test: blockdev write read 8 blocks ...passed 00:07:54.847 Test: blockdev write read size > 128k ...passed 00:07:54.847 Test: blockdev write read invalid size ...passed 00:07:54.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.847 Test: blockdev write read max offset ...passed 00:07:54.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.847 Test: blockdev writev readv 8 blocks ...passed 00:07:54.847 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.847 Test: blockdev writev readv block ...passed 00:07:54.847 Test: blockdev writev readv size > 128k ...passed 00:07:54.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.847 Test: blockdev comparev and writev ...passed 00:07:54.847 Test: blockdev nvme passthru rw ...passed 00:07:54.847 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.847 Test: blockdev nvme admin passthru ...passed 00:07:54.847 Test: blockdev copy ...passed 00:07:54.847 Suite: bdevio tests on: Malloc2p0 00:07:54.847 Test: blockdev write read block ...passed 00:07:54.847 Test: blockdev write zeroes read block 
...passed 00:07:54.847 Test: blockdev write zeroes read no split ...passed 00:07:54.847 Test: blockdev write zeroes read split ...passed 00:07:54.847 Test: blockdev write zeroes read split partial ...passed 00:07:54.847 Test: blockdev reset ...passed 00:07:54.847 Test: blockdev write read 8 blocks ...passed 00:07:54.847 Test: blockdev write read size > 128k ...passed 00:07:54.847 Test: blockdev write read invalid size ...passed 00:07:54.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.847 Test: blockdev write read max offset ...passed 00:07:54.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.847 Test: blockdev writev readv 8 blocks ...passed 00:07:54.847 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.847 Test: blockdev writev readv block ...passed 00:07:54.847 Test: blockdev writev readv size > 128k ...passed 00:07:54.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.847 Test: blockdev comparev and writev ...passed 00:07:54.847 Test: blockdev nvme passthru rw ...passed 00:07:54.847 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.847 Test: blockdev nvme admin passthru ...passed 00:07:54.847 Test: blockdev copy ...passed 00:07:54.847 Suite: bdevio tests on: Malloc1p1 00:07:54.847 Test: blockdev write read block ...passed 00:07:54.847 Test: blockdev write zeroes read block ...passed 00:07:54.847 Test: blockdev write zeroes read no split ...passed 00:07:54.847 Test: blockdev write zeroes read split ...passed 00:07:54.847 Test: blockdev write zeroes read split partial ...passed 00:07:54.847 Test: blockdev reset ...passed 00:07:54.847 Test: blockdev write read 8 blocks ...passed 00:07:54.847 Test: blockdev write read size > 128k ...passed 00:07:54.847 Test: blockdev write read invalid size ...passed 00:07:54.847 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:07:54.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.847 Test: blockdev write read max offset ...passed 00:07:54.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.847 Test: blockdev writev readv 8 blocks ...passed 00:07:54.847 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.847 Test: blockdev writev readv block ...passed 00:07:54.847 Test: blockdev writev readv size > 128k ...passed 00:07:54.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.847 Test: blockdev comparev and writev ...passed 00:07:54.847 Test: blockdev nvme passthru rw ...passed 00:07:54.847 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.847 Test: blockdev nvme admin passthru ...passed 00:07:54.847 Test: blockdev copy ...passed 00:07:54.847 Suite: bdevio tests on: Malloc1p0 00:07:54.847 Test: blockdev write read block ...passed 00:07:54.847 Test: blockdev write zeroes read block ...passed 00:07:54.847 Test: blockdev write zeroes read no split ...passed 00:07:54.847 Test: blockdev write zeroes read split ...passed 00:07:54.847 Test: blockdev write zeroes read split partial ...passed 00:07:54.847 Test: blockdev reset ...passed 00:07:54.847 Test: blockdev write read 8 blocks ...passed 00:07:54.847 Test: blockdev write read size > 128k ...passed 00:07:54.847 Test: blockdev write read invalid size ...passed 00:07:54.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.847 Test: blockdev write read max offset ...passed 00:07:54.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.847 Test: blockdev writev readv 8 blocks ...passed 00:07:54.847 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.847 Test: blockdev writev readv block ...passed 00:07:54.847 Test: blockdev writev readv size > 128k ...passed 
00:07:54.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.847 Test: blockdev comparev and writev ...passed 00:07:54.847 Test: blockdev nvme passthru rw ...passed 00:07:54.847 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.847 Test: blockdev nvme admin passthru ...passed 00:07:54.847 Test: blockdev copy ...passed 00:07:54.847 Suite: bdevio tests on: Malloc0 00:07:54.847 Test: blockdev write read block ...passed 00:07:54.847 Test: blockdev write zeroes read block ...passed 00:07:54.847 Test: blockdev write zeroes read no split ...passed 00:07:54.847 Test: blockdev write zeroes read split ...passed 00:07:54.847 Test: blockdev write zeroes read split partial ...passed 00:07:54.847 Test: blockdev reset ...passed 00:07:54.847 Test: blockdev write read 8 blocks ...passed 00:07:54.847 Test: blockdev write read size > 128k ...passed 00:07:54.847 Test: blockdev write read invalid size ...passed 00:07:54.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:54.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:54.847 Test: blockdev write read max offset ...passed 00:07:54.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:54.847 Test: blockdev writev readv 8 blocks ...passed 00:07:54.847 Test: blockdev writev readv 30 x 1block ...passed 00:07:54.847 Test: blockdev writev readv block ...passed 00:07:54.847 Test: blockdev writev readv size > 128k ...passed 00:07:54.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:54.847 Test: blockdev comparev and writev ...passed 00:07:54.847 Test: blockdev nvme passthru rw ...passed 00:07:54.847 Test: blockdev nvme passthru vendor specific ...passed 00:07:54.847 Test: blockdev nvme admin passthru ...passed 00:07:54.847 Test: blockdev copy ...passed 00:07:54.847 00:07:54.847 Run Summary: Type Total Ran Passed Failed Inactive 00:07:54.847 suites 16 16 n/a 0 0 00:07:54.847 tests 368 368 368 
0 0 00:07:54.847 asserts 2224 2224 2224 0 n/a 00:07:54.847 00:07:54.847 Elapsed time = 0.515 seconds 00:07:54.847 0 00:07:54.847 11:44:21 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1645406 00:07:54.847 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 1645406 ']' 00:07:54.847 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 1645406 00:07:54.847 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:07:54.847 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:54.847 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1645406 00:07:54.848 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:54.848 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:54.848 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1645406' 00:07:54.848 killing process with pid 1645406 00:07:54.848 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@965 -- # kill 1645406 00:07:54.848 11:44:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@970 -- # wait 1645406 00:07:55.106 11:44:22 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:07:55.106 00:07:55.106 real 0m1.725s 00:07:55.106 user 0m4.259s 00:07:55.106 sys 0m0.508s 00:07:55.106 11:44:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:55.106 11:44:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:55.106 ************************************ 00:07:55.106 END TEST bdev_bounds 00:07:55.106 ************************************ 00:07:55.106 11:44:22 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:55.106 11:44:22 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:07:55.106 11:44:22 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:55.106 11:44:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:55.366 ************************************ 00:07:55.366 START TEST bdev_nbd 00:07:55.366 ************************************ 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1645776 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1645776 /var/tmp/spdk-nbd.sock 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 1645776 ']' 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:55.366 
11:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:55.366 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:55.366 11:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:55.366 [2024-05-14 11:44:22.298710] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:07:55.366 [2024-05-14 11:44:22.298780] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:55.366 [2024-05-14 11:44:22.432369] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.625 [2024-05-14 11:44:22.536231] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.625 [2024-05-14 11:44:22.696727] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:55.625 [2024-05-14 11:44:22.696792] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:55.625 [2024-05-14 11:44:22.696808] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:55.625 [2024-05-14 11:44:22.704737] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:55.625 [2024-05-14 11:44:22.704764] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:55.884 [2024-05-14 11:44:22.712748] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:55.884 [2024-05-14 11:44:22.712772] 
bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:55.884 [2024-05-14 11:44:22.790043] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:55.884 [2024-05-14 11:44:22.790097] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:55.884 [2024-05-14 11:44:22.790114] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1260870 00:07:55.884 [2024-05-14 11:44:22.790127] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:55.884 [2024-05-14 11:44:22.791576] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:55.884 [2024-05-14 11:44:22.791606] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT 
raid0 concat0 raid1 AIO0' 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:56.451 11:44:23 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:56.451 1+0 records in 00:07:56.451 1+0 records out 00:07:56.451 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261286 s, 15.7 MB/s 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:56.451 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:56.709 11:44:23 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:56.709 1+0 records in 00:07:56.709 1+0 records out 00:07:56.709 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258806 s, 15.8 MB/s 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:56.709 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd2 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:56.968 1+0 records in 00:07:56.968 1+0 records out 00:07:56.968 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295176 s, 13.9 MB/s 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:56.968 11:44:23 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:56.968 11:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:57.226 1+0 records in 00:07:57.226 1+0 records out 00:07:57.226 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297113 s, 13.8 MB/s 00:07:57.226 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:57.227 11:44:24 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # size=4096 00:07:57.227 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:57.227 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:57.227 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:57.227 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:57.227 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:57.227 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd4 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd4 /proc/partitions 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd4 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:57.485 1+0 records in 00:07:57.485 1+0 records out 00:07:57.485 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000332599 s, 12.3 MB/s 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:57.485 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd5 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep 
-q -w nbd5 /proc/partitions 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:57.743 1+0 records in 00:07:57.743 1+0 records out 00:07:57.743 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000355263 s, 11.5 MB/s 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:57.743 11:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@864 -- # local nbd_name=nbd6 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd6 /proc/partitions 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:58.001 1+0 records in 00:07:58.001 1+0 records out 00:07:58.001 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000483013 s, 8.5 MB/s 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:58.001 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:58.259 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:58.259 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:58.260 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:58.260 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:58.260 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:07:58.260 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:07:58.518 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:07:58.518 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:07:58.518 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd7 00:07:58.518 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:58.518 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:58.518 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:58.518 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd7 /proc/partitions 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:58.519 1+0 records in 00:07:58.519 1+0 records out 00:07:58.519 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000397751 s, 10.3 MB/s 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:58.519 11:44:25 
blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:58.519 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd8 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd8 /proc/partitions 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:58.778 1+0 records in 00:07:58.778 1+0 records out 00:07:58.778 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000455128 s, 9.0 MB/s 00:07:58.778 11:44:25 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:58.778 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd9 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd9 /proc/partitions 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:59.037 11:44:25 
blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:59.037 1+0 records in 00:07:59.037 1+0 records out 00:07:59.037 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000554467 s, 7.4 MB/s 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:59.037 11:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 
00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:59.297 1+0 records in 00:07:59.297 1+0 records out 00:07:59.297 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000589447 s, 6.9 MB/s 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:59.297 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:59.556 1+0 records in 00:07:59.556 1+0 records out 00:07:59.556 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00051756 s, 7.9 MB/s 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 
00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:59.556 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd12 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd12 /proc/partitions 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:59.815 1+0 records in 00:07:59.815 1+0 records out 00:07:59.815 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000611781 s, 6.7 MB/s 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:07:59.815 11:44:26 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:59.815 11:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd13 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd13 /proc/partitions 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:08:00.074 1+0 records in 00:08:00.074 1+0 records out 00:08:00.074 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00080708 s, 5.1 MB/s 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:00.074 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd14 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd14 /proc/partitions 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # break 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:00.333 1+0 records in 00:08:00.333 1+0 records out 00:08:00.333 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000640617 s, 6.4 MB/s 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:00.333 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd15 00:08:00.592 11:44:27 
blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd15 /proc/partitions 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:00.592 1+0 records in 00:08:00.592 1+0 records out 00:08:00.592 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000682572 s, 6.0 MB/s 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:00.592 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 
00:08:00.851 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd0", 00:08:00.851 "bdev_name": "Malloc0" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd1", 00:08:00.851 "bdev_name": "Malloc1p0" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd2", 00:08:00.851 "bdev_name": "Malloc1p1" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd3", 00:08:00.851 "bdev_name": "Malloc2p0" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd4", 00:08:00.851 "bdev_name": "Malloc2p1" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd5", 00:08:00.851 "bdev_name": "Malloc2p2" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd6", 00:08:00.851 "bdev_name": "Malloc2p3" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd7", 00:08:00.851 "bdev_name": "Malloc2p4" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd8", 00:08:00.851 "bdev_name": "Malloc2p5" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd9", 00:08:00.851 "bdev_name": "Malloc2p6" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd10", 00:08:00.851 "bdev_name": "Malloc2p7" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd11", 00:08:00.851 "bdev_name": "TestPT" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd12", 00:08:00.851 "bdev_name": "raid0" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd13", 00:08:00.851 "bdev_name": "concat0" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd14", 00:08:00.851 "bdev_name": "raid1" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd15", 00:08:00.851 "bdev_name": "AIO0" 00:08:00.851 } 00:08:00.851 ]' 00:08:00.851 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 
00:08:00.851 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd0", 00:08:00.851 "bdev_name": "Malloc0" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd1", 00:08:00.851 "bdev_name": "Malloc1p0" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd2", 00:08:00.851 "bdev_name": "Malloc1p1" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd3", 00:08:00.851 "bdev_name": "Malloc2p0" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd4", 00:08:00.851 "bdev_name": "Malloc2p1" 00:08:00.851 }, 00:08:00.851 { 00:08:00.851 "nbd_device": "/dev/nbd5", 00:08:00.851 "bdev_name": "Malloc2p2" 00:08:00.851 }, 00:08:00.851 { 00:08:00.852 "nbd_device": "/dev/nbd6", 00:08:00.852 "bdev_name": "Malloc2p3" 00:08:00.852 }, 00:08:00.852 { 00:08:00.852 "nbd_device": "/dev/nbd7", 00:08:00.852 "bdev_name": "Malloc2p4" 00:08:00.852 }, 00:08:00.852 { 00:08:00.852 "nbd_device": "/dev/nbd8", 00:08:00.852 "bdev_name": "Malloc2p5" 00:08:00.852 }, 00:08:00.852 { 00:08:00.852 "nbd_device": "/dev/nbd9", 00:08:00.852 "bdev_name": "Malloc2p6" 00:08:00.852 }, 00:08:00.852 { 00:08:00.852 "nbd_device": "/dev/nbd10", 00:08:00.852 "bdev_name": "Malloc2p7" 00:08:00.852 }, 00:08:00.852 { 00:08:00.852 "nbd_device": "/dev/nbd11", 00:08:00.852 "bdev_name": "TestPT" 00:08:00.852 }, 00:08:00.852 { 00:08:00.852 "nbd_device": "/dev/nbd12", 00:08:00.852 "bdev_name": "raid0" 00:08:00.852 }, 00:08:00.852 { 00:08:00.852 "nbd_device": "/dev/nbd13", 00:08:00.852 "bdev_name": "concat0" 00:08:00.852 }, 00:08:00.852 { 00:08:00.852 "nbd_device": "/dev/nbd14", 00:08:00.852 "bdev_name": "raid1" 00:08:00.852 }, 00:08:00.852 { 00:08:00.852 "nbd_device": "/dev/nbd15", 00:08:00.852 "bdev_name": "AIO0" 00:08:00.852 } 00:08:00.852 ]' 00:08:00.852 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:00.852 11:44:27 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:00.852 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:00.852 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:00.852 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:00.852 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:00.852 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:00.852 11:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:01.111 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:01.111 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:01.111 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:01.111 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:01.111 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:01.111 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:01.111 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:01.111 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:01.111 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:01.111 11:44:28 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:01.370 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:01.635 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:01.635 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:01.635 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:01.635 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:01.635 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:01.635 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:01.899 
11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:01.899 11:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # 
break 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:02.157 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:02.463 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd6 /proc/partitions 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:02.722 11:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:02.980 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:03.238 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:03.498 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:03.853 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:03.853 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:03.853 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:03.853 11:44:30 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:03.853 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:03.853 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:03.853 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:03.853 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:03.853 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:03.853 11:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:04.112 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 
00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:04.371 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:04.630 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd14 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:04.889 11:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:05.148 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:05.148 11:44:32 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 
'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 
00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:05.428 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:05.693 /dev/nbd0 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:05.693 1+0 records in 00:08:05.693 1+0 records out 00:08:05.693 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254172 s, 16.1 MB/s 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:05.693 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:05.952 /dev/nbd1 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:05.952 1+0 records in 00:08:05.952 1+0 records out 00:08:05.952 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278569 s, 14.7 
MB/s 00:08:05.952 11:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:05.952 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:05.952 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:05.952 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:05.952 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:05.952 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:05.952 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:05.952 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:06.211 /dev/nbd10 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:06.211 1+0 records in 00:08:06.211 1+0 records out 00:08:06.211 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272822 s, 15.0 MB/s 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:06.211 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:06.470 /dev/nbd11 00:08:06.470 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 
00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:06.729 1+0 records in 00:08:06.729 1+0 records out 00:08:06.729 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000345242 s, 11.9 MB/s 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:06.729 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:08:06.988 /dev/nbd12 00:08:06.988 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:08:06.988 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:08:06.988 11:44:33 
blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd12 00:08:06.988 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:06.988 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd12 /proc/partitions 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:06.989 1+0 records in 00:08:06.989 1+0 records out 00:08:06.989 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000396204 s, 10.3 MB/s 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:06.989 11:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:08:07.248 /dev/nbd13 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd13 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd13 /proc/partitions 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:07.248 1+0 records in 00:08:07.248 1+0 records out 00:08:07.248 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000446731 s, 9.2 MB/s 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 
0 ']' 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:07.248 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:08:07.507 /dev/nbd14 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd14 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd14 /proc/partitions 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:07.507 1+0 records in 00:08:07.507 1+0 records out 00:08:07.507 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00047018 s, 8.7 MB/s 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:07.507 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:08:07.767 /dev/nbd15 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd15 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd15 /proc/partitions 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@881 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:07.767 1+0 records in 00:08:07.767 1+0 records out 00:08:07.767 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000454242 s, 9.0 MB/s 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:07.767 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:08:08.025 /dev/nbd2 00:08:08.025 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:08:08.025 11:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:08:08.025 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:08:08.026 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:08.026 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:08.026 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:08.026 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 
/proc/partitions 00:08:08.026 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:08.026 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:08.026 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:08.026 11:44:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:08.026 1+0 records in 00:08:08.026 1+0 records out 00:08:08.026 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000563912 s, 7.3 MB/s 00:08:08.026 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:08.026 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:08.026 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:08.026 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:08.026 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:08.026 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:08.026 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:08.026 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:08:08.284 /dev/nbd3 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:08:08.284 11:44:35 
blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:08.284 1+0 records in 00:08:08.284 1+0 records out 00:08:08.284 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000467476 s, 8.8 MB/s 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:08.284 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:08.285 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:08.285 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:08.285 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:08.285 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:08.285 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:08.285 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 
/dev/nbd4 00:08:08.543 /dev/nbd4 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd4 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd4 /proc/partitions 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:08.543 1+0 records in 00:08:08.543 1+0 records out 00:08:08.543 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000570072 s, 7.2 MB/s 00:08:08.543 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:08.544 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:08.544 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:08.544 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:08.544 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:08.544 
11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:08.544 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:08.544 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:08:08.802 /dev/nbd5 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd5 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd5 /proc/partitions 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:08.802 1+0 records in 00:08:08.802 1+0 records out 00:08:08.802 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000742838 s, 5.5 MB/s 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:08.802 11:44:35 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:08.802 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:08.803 11:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:08:09.061 /dev/nbd6 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd6 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd6 /proc/partitions 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:09.061 1+0 records in 00:08:09.061 1+0 records out 
00:08:09.061 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00076032 s, 5.4 MB/s 00:08:09.061 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:09.318 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:09.318 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:09.318 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:09.318 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:09.318 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:09.318 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:09.318 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:08:09.318 /dev/nbd7 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd7 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd7 /proc/partitions 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 
00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:09.576 1+0 records in 00:08:09.576 1+0 records out 00:08:09.576 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000611165 s, 6.7 MB/s 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:09.576 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:08:09.835 /dev/nbd8 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd8 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd8 /proc/partitions 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:09.835 1+0 records in 00:08:09.835 1+0 records out 00:08:09.835 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00075576 s, 5.4 MB/s 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:09.835 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:08:10.094 /dev/nbd9 00:08:10.094 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:08:10.094 11:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 
00:08:10.094 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd9 00:08:10.094 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:08:10.094 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:10.094 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:10.094 11:44:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd9 /proc/partitions 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:10.094 1+0 records in 00:08:10.094 1+0 records out 00:08:10.094 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000860838 s, 4.8 MB/s 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:10.094 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:10.353 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd0", 00:08:10.353 "bdev_name": "Malloc0" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd1", 00:08:10.353 "bdev_name": "Malloc1p0" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd10", 00:08:10.353 "bdev_name": "Malloc1p1" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd11", 00:08:10.353 "bdev_name": "Malloc2p0" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd12", 00:08:10.353 "bdev_name": "Malloc2p1" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd13", 00:08:10.353 "bdev_name": "Malloc2p2" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd14", 00:08:10.353 "bdev_name": "Malloc2p3" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd15", 00:08:10.353 "bdev_name": "Malloc2p4" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd2", 00:08:10.353 "bdev_name": "Malloc2p5" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd3", 00:08:10.353 "bdev_name": "Malloc2p6" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd4", 00:08:10.353 "bdev_name": "Malloc2p7" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd5", 00:08:10.353 "bdev_name": "TestPT" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd6", 00:08:10.353 "bdev_name": "raid0" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd7", 00:08:10.353 "bdev_name": "concat0" 00:08:10.353 }, 00:08:10.353 { 
00:08:10.353 "nbd_device": "/dev/nbd8", 00:08:10.353 "bdev_name": "raid1" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd9", 00:08:10.353 "bdev_name": "AIO0" 00:08:10.353 } 00:08:10.353 ]' 00:08:10.353 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd0", 00:08:10.353 "bdev_name": "Malloc0" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd1", 00:08:10.353 "bdev_name": "Malloc1p0" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd10", 00:08:10.353 "bdev_name": "Malloc1p1" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd11", 00:08:10.353 "bdev_name": "Malloc2p0" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd12", 00:08:10.353 "bdev_name": "Malloc2p1" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd13", 00:08:10.353 "bdev_name": "Malloc2p2" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd14", 00:08:10.353 "bdev_name": "Malloc2p3" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd15", 00:08:10.353 "bdev_name": "Malloc2p4" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd2", 00:08:10.353 "bdev_name": "Malloc2p5" 00:08:10.353 }, 00:08:10.353 { 00:08:10.353 "nbd_device": "/dev/nbd3", 00:08:10.353 "bdev_name": "Malloc2p6" 00:08:10.353 }, 00:08:10.354 { 00:08:10.354 "nbd_device": "/dev/nbd4", 00:08:10.354 "bdev_name": "Malloc2p7" 00:08:10.354 }, 00:08:10.354 { 00:08:10.354 "nbd_device": "/dev/nbd5", 00:08:10.354 "bdev_name": "TestPT" 00:08:10.354 }, 00:08:10.354 { 00:08:10.354 "nbd_device": "/dev/nbd6", 00:08:10.354 "bdev_name": "raid0" 00:08:10.354 }, 00:08:10.354 { 00:08:10.354 "nbd_device": "/dev/nbd7", 00:08:10.354 "bdev_name": "concat0" 00:08:10.354 }, 00:08:10.354 { 00:08:10.354 "nbd_device": "/dev/nbd8", 00:08:10.354 "bdev_name": "raid1" 00:08:10.354 }, 00:08:10.354 { 00:08:10.354 "nbd_device": "/dev/nbd9", 00:08:10.354 
"bdev_name": "AIO0" 00:08:10.354 } 00:08:10.354 ]' 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:10.354 /dev/nbd1 00:08:10.354 /dev/nbd10 00:08:10.354 /dev/nbd11 00:08:10.354 /dev/nbd12 00:08:10.354 /dev/nbd13 00:08:10.354 /dev/nbd14 00:08:10.354 /dev/nbd15 00:08:10.354 /dev/nbd2 00:08:10.354 /dev/nbd3 00:08:10.354 /dev/nbd4 00:08:10.354 /dev/nbd5 00:08:10.354 /dev/nbd6 00:08:10.354 /dev/nbd7 00:08:10.354 /dev/nbd8 00:08:10.354 /dev/nbd9' 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:10.354 /dev/nbd1 00:08:10.354 /dev/nbd10 00:08:10.354 /dev/nbd11 00:08:10.354 /dev/nbd12 00:08:10.354 /dev/nbd13 00:08:10.354 /dev/nbd14 00:08:10.354 /dev/nbd15 00:08:10.354 /dev/nbd2 00:08:10.354 /dev/nbd3 00:08:10.354 /dev/nbd4 00:08:10.354 /dev/nbd5 00:08:10.354 /dev/nbd6 00:08:10.354 /dev/nbd7 00:08:10.354 /dev/nbd8 00:08:10.354 /dev/nbd9' 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:08:10.354 256+0 records in 00:08:10.354 256+0 records out 00:08:10.354 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105304 s, 99.6 MB/s 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:10.354 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:10.612 256+0 records in 00:08:10.612 256+0 records out 00:08:10.612 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181778 s, 5.8 MB/s 00:08:10.612 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:10.612 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:10.870 256+0 records in 00:08:10.870 256+0 records out 00:08:10.870 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.180736 s, 5.8 MB/s 00:08:10.870 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:10.870 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 
bs=4096 count=256 oflag=direct 00:08:10.870 256+0 records in 00:08:10.870 256+0 records out 00:08:10.870 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182834 s, 5.7 MB/s 00:08:10.871 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:10.871 11:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:08:11.129 256+0 records in 00:08:11.129 256+0 records out 00:08:11.129 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.17726 s, 5.9 MB/s 00:08:11.129 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:11.129 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:08:11.386 256+0 records in 00:08:11.386 256+0 records out 00:08:11.386 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.175076 s, 6.0 MB/s 00:08:11.386 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:11.386 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:08:11.386 256+0 records in 00:08:11.386 256+0 records out 00:08:11.386 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183426 s, 5.7 MB/s 00:08:11.386 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:11.386 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:08:11.645 256+0 records in 00:08:11.645 256+0 records out 00:08:11.645 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18359 s, 5.7 MB/s 00:08:11.645 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i 
in "${nbd_list[@]}" 00:08:11.645 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:08:11.904 256+0 records in 00:08:11.904 256+0 records out 00:08:11.904 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183622 s, 5.7 MB/s 00:08:11.904 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:11.904 11:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:08:12.163 256+0 records in 00:08:12.163 256+0 records out 00:08:12.163 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183739 s, 5.7 MB/s 00:08:12.163 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:12.163 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:08:12.163 256+0 records in 00:08:12.163 256+0 records out 00:08:12.163 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18332 s, 5.7 MB/s 00:08:12.163 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:12.163 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:08:12.421 256+0 records in 00:08:12.421 256+0 records out 00:08:12.421 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181083 s, 5.8 MB/s 00:08:12.421 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:12.421 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:08:12.690 256+0 records in 
00:08:12.690 256+0 records out 00:08:12.690 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183835 s, 5.7 MB/s 00:08:12.690 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:12.690 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:08:12.690 256+0 records in 00:08:12.690 256+0 records out 00:08:12.690 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184592 s, 5.7 MB/s 00:08:12.690 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:12.690 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:08:12.948 256+0 records in 00:08:12.948 256+0 records out 00:08:12.948 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184813 s, 5.7 MB/s 00:08:12.948 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:12.948 11:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:08:13.206 256+0 records in 00:08:13.206 256+0 records out 00:08:13.206 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.188361 s, 5.6 MB/s 00:08:13.206 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:13.206 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:08:13.464 256+0 records in 00:08:13.464 256+0 records out 00:08:13.464 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182205 s, 5.8 MB/s 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 
/dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.464 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
/dev/nbd4 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:13.465 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:13.723 11:44:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:13.981 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:13.981 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:13.981 11:44:41 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:13.981 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:13.981 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:13.981 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:13.981 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:13.981 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:13.981 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:13.981 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:14.239 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 
00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:14.498 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:14.756 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:14.756 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:14.756 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:14.756 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:14.756 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:14.756 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:15.014 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:15.014 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:15.014 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:15.014 11:44:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd13 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:15.272 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:15.530 11:44:42 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:15.788 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:16.046 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:16.046 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:16.046 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:16.046 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:16.046 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:16.046 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:16.046 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:16.046 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:16.046 11:44:42 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:16.046 11:44:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:16.304 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:16.304 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:16.304 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:16.304 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:16.304 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:16.304 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:16.304 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:16.304 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:16.305 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:16.305 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:16.563 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:16.821 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd6 /proc/partitions 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:17.080 11:44:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:17.339 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:17.598 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:17.857 11:44:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:18.116 11:44:45 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:08:18.116 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:08:18.375 
malloc_lvol_verify 00:08:18.375 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:08:18.634 4956b17b-abed-4ad5-bd21-36a88cc06f53 00:08:18.634 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:08:18.893 9e11f19f-1008-4bf2-998f-bf537a8b65c8 00:08:18.893 11:44:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:08:19.152 /dev/nbd0 00:08:19.152 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:08:19.152 mke2fs 1.46.5 (30-Dec-2021) 00:08:19.152 Discarding device blocks: 0/4096 done 00:08:19.152 Creating filesystem with 4096 1k blocks and 1024 inodes 00:08:19.152 00:08:19.152 Allocating group tables: 0/1 done 00:08:19.152 Writing inode tables: 0/1 done 00:08:19.152 Creating journal (1024 blocks): done 00:08:19.152 Writing superblocks and filesystem accounting information: 0/1 done 00:08:19.152 00:08:19.152 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:08:19.152 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:08:19.152 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:19.152 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:19.152 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:19.152 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:19.152 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:19.152 
11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1645776 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 1645776 ']' 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 1645776 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1645776 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:19.411 11:44:46 
blockdev_general.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1645776' 00:08:19.411 killing process with pid 1645776 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@965 -- # kill 1645776 00:08:19.411 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@970 -- # wait 1645776 00:08:19.980 11:44:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:08:19.980 00:08:19.980 real 0m24.566s 00:08:19.980 user 0m29.969s 00:08:19.980 sys 0m14.266s 00:08:19.980 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:19.980 11:44:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:19.980 ************************************ 00:08:19.980 END TEST bdev_nbd 00:08:19.980 ************************************ 00:08:19.980 11:44:46 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:08:19.980 11:44:46 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:08:19.980 11:44:46 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:08:19.980 11:44:46 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:08:19.980 11:44:46 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:08:19.980 11:44:46 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:19.980 11:44:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:19.980 ************************************ 00:08:19.980 START TEST bdev_fio 00:08:19.980 ************************************ 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:19.980 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1309 
-- # '[' verify == verify ']' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 
00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:08:19.980 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:08:19.981 11:44:46 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:19.981 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:08:19.981 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:19.981 11:44:46 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:19.981 ************************************ 00:08:19.981 START TEST bdev_fio_rw_verify 00:08:19.981 ************************************ 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 
-- # local sanitizers 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:08:19.981 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:20.238 11:44:47 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:20.496 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:08:20.496 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:20.496 fio-3.35 00:08:20.496 Starting 16 threads 00:08:32.690 00:08:32.690 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1649792: Tue May 14 11:44:58 2024 00:08:32.690 read: IOPS=85.2k, BW=333MiB/s (349MB/s)(3328MiB/10001msec) 00:08:32.690 slat (usec): min=3, max=224, avg=37.11, stdev=14.12 00:08:32.690 clat (usec): min=11, max=1074, avg=310.69, stdev=131.88 00:08:32.690 lat (usec): min=18, max=1142, avg=347.80, stdev=138.94 00:08:32.690 clat percentiles (usec): 00:08:32.690 | 50.000th=[ 306], 99.000th=[ 594], 99.900th=[ 652], 99.990th=[ 807], 00:08:32.690 | 99.999th=[ 914] 00:08:32.690 write: IOPS=134k, BW=523MiB/s (549MB/s)(5164MiB/9870msec); 0 zone resets 00:08:32.690 slat (usec): min=8, max=4148, avg=50.83, stdev=14.68 00:08:32.690 clat (usec): min=12, max=4622, avg=370.33, stdev=155.11 00:08:32.690 lat (usec): min=40, max=4677, avg=421.16, stdev=161.36 00:08:32.690 clat percentiles (usec): 00:08:32.690 | 50.000th=[ 359], 99.000th=[ 725], 99.900th=[ 857], 99.990th=[ 963], 00:08:32.690 | 99.999th=[ 1156] 00:08:32.690 bw ( KiB/s): min=463896, max=610731, per=98.72%, avg=528888.11, stdev=2596.19, samples=304 00:08:32.690 iops : min=115974, max=152680, avg=132221.79, stdev=649.03, samples=304 
00:08:32.690 lat (usec) : 20=0.01%, 50=0.23%, 100=2.85%, 250=26.71%, 500=53.26% 00:08:32.690 lat (usec) : 750=16.54%, 1000=0.41% 00:08:32.690 lat (msec) : 2=0.01%, 10=0.01% 00:08:32.690 cpu : usr=99.26%, sys=0.33%, ctx=645, majf=0, minf=2103 00:08:32.690 IO depths : 1=12.4%, 2=24.8%, 4=50.2%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:32.690 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:32.690 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:32.690 issued rwts: total=852021,1321987,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:32.690 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:32.690 00:08:32.690 Run status group 0 (all jobs): 00:08:32.690 READ: bw=333MiB/s (349MB/s), 333MiB/s-333MiB/s (349MB/s-349MB/s), io=3328MiB (3490MB), run=10001-10001msec 00:08:32.690 WRITE: bw=523MiB/s (549MB/s), 523MiB/s-523MiB/s (549MB/s-549MB/s), io=5164MiB (5415MB), run=9870-9870msec 00:08:32.690 00:08:32.690 real 0m11.725s 00:08:32.690 user 2m45.284s 00:08:32.690 sys 0m1.148s 00:08:32.690 11:44:58 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:32.690 11:44:58 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:08:32.690 ************************************ 00:08:32.690 END TEST bdev_fio_rw_verify 00:08:32.690 ************************************ 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:08:32.690 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:32.692 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "f7239e85-b7ef-49ca-aead-d105333099ed"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f7239e85-b7ef-49ca-aead-d105333099ed",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "611cfa7b-6fda-5561-a335-d47b22a64e49"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "611cfa7b-6fda-5561-a335-d47b22a64e49",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "be6136e7-4455-5283-88e8-47b4a8d09173"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "be6136e7-4455-5283-88e8-47b4a8d09173",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' 
'{' ' "name": "Malloc2p0",' ' "aliases": [' ' "965d7fe7-2496-51d1-96f7-650761f32f9f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "965d7fe7-2496-51d1-96f7-650761f32f9f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "a01f5184-266b-569e-93f8-d53145bd9bca"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a01f5184-266b-569e-93f8-d53145bd9bca",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "bcc0a70b-aed3-5507-9607-8e703ae1685d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bcc0a70b-aed3-5507-9607-8e703ae1685d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": 
true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "4b0ea32a-d9a7-5206-985c-61f446a796bb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4b0ea32a-d9a7-5206-985c-61f446a796bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "1b594e1d-b1b7-54c0-bf83-7f0a6856c2e7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1b594e1d-b1b7-54c0-bf83-7f0a6856c2e7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "44aee8c5-eec9-56ad-9dc6-b2ceca842e38"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' 
"uuid": "44aee8c5-eec9-56ad-9dc6-b2ceca842e38",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "aff2c8e0-68cb-5368-acdd-3130eaedaf94"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "aff2c8e0-68cb-5368-acdd-3130eaedaf94",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "3110a40c-a829-52d3-be69-007b7acf9797"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3110a40c-a829-52d3-be69-007b7acf9797",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' 
"nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "2e095030-0f4a-592b-841d-2233c2f4cc2f"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2e095030-0f4a-592b-841d-2233c2f4cc2f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "c0a34f8a-3f24-43ca-ae72-98075203d2a6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "c0a34f8a-3f24-43ca-ae72-98075203d2a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c0a34f8a-3f24-43ca-ae72-98075203d2a6",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "28759b12-dccd-402e-bcec-6a9cff0bf5be",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "e078e1e8-e42f-4de0-827b-ec2be66dc651",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "a7bb643b-c1a6-4697-98ab-8738de6ebe44"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a7bb643b-c1a6-4697-98ab-8738de6ebe44",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a7bb643b-c1a6-4697-98ab-8738de6ebe44",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' 
"uuid": "7f2c24c5-109d-4caf-bfb1-af251ab59985",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "305f0226-469e-4f8f-95e8-4a91f6f1c175",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "772e922b-3284-4c2f-bce9-1cd597d49835"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "772e922b-3284-4c2f-bce9-1cd597d49835",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "772e922b-3284-4c2f-bce9-1cd597d49835",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "df7e6225-4923-4189-bc25-8fcff002d358",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "d8b78ed1-32fd-4bc5-a921-80fe8bccd7e3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "c2e88c3f-e6b4-4576-88c9-38cb1a20109e"' ' ],' ' 
"product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "c2e88c3f-e6b4-4576-88c9-38cb1a20109e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:32.692 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:08:32.692 Malloc1p0 00:08:32.692 Malloc1p1 00:08:32.692 Malloc2p0 00:08:32.692 Malloc2p1 00:08:32.692 Malloc2p2 00:08:32.692 Malloc2p3 00:08:32.692 Malloc2p4 00:08:32.692 Malloc2p5 00:08:32.692 Malloc2p6 00:08:32.692 Malloc2p7 00:08:32.692 TestPT 00:08:32.692 raid0 00:08:32.692 concat0 ]] 00:08:32.692 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "f7239e85-b7ef-49ca-aead-d105333099ed"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f7239e85-b7ef-49ca-aead-d105333099ed",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' 
' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "611cfa7b-6fda-5561-a335-d47b22a64e49"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "611cfa7b-6fda-5561-a335-d47b22a64e49",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "be6136e7-4455-5283-88e8-47b4a8d09173"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "be6136e7-4455-5283-88e8-47b4a8d09173",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "965d7fe7-2496-51d1-96f7-650761f32f9f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "965d7fe7-2496-51d1-96f7-650761f32f9f",' 
' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "a01f5184-266b-569e-93f8-d53145bd9bca"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a01f5184-266b-569e-93f8-d53145bd9bca",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "bcc0a70b-aed3-5507-9607-8e703ae1685d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bcc0a70b-aed3-5507-9607-8e703ae1685d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' 
' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "4b0ea32a-d9a7-5206-985c-61f446a796bb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4b0ea32a-d9a7-5206-985c-61f446a796bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "1b594e1d-b1b7-54c0-bf83-7f0a6856c2e7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1b594e1d-b1b7-54c0-bf83-7f0a6856c2e7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "44aee8c5-eec9-56ad-9dc6-b2ceca842e38"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "44aee8c5-eec9-56ad-9dc6-b2ceca842e38",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "aff2c8e0-68cb-5368-acdd-3130eaedaf94"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "aff2c8e0-68cb-5368-acdd-3130eaedaf94",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "3110a40c-a829-52d3-be69-007b7acf9797"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3110a40c-a829-52d3-be69-007b7acf9797",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "2e095030-0f4a-592b-841d-2233c2f4cc2f"' ' ],' ' "product_name": 
"passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2e095030-0f4a-592b-841d-2233c2f4cc2f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "c0a34f8a-3f24-43ca-ae72-98075203d2a6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "c0a34f8a-3f24-43ca-ae72-98075203d2a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c0a34f8a-3f24-43ca-ae72-98075203d2a6",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' 
"num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "28759b12-dccd-402e-bcec-6a9cff0bf5be",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "e078e1e8-e42f-4de0-827b-ec2be66dc651",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "a7bb643b-c1a6-4697-98ab-8738de6ebe44"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a7bb643b-c1a6-4697-98ab-8738de6ebe44",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a7bb643b-c1a6-4697-98ab-8738de6ebe44",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "7f2c24c5-109d-4caf-bfb1-af251ab59985",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "305f0226-469e-4f8f-95e8-4a91f6f1c175",' ' "is_configured": true,' ' "data_offset": 
0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "772e922b-3284-4c2f-bce9-1cd597d49835"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "772e922b-3284-4c2f-bce9-1cd597d49835",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "772e922b-3284-4c2f-bce9-1cd597d49835",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "df7e6225-4923-4189-bc25-8fcff002d358",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "d8b78ed1-32fd-4bc5-a921-80fe8bccd7e3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "c2e88c3f-e6b4-4576-88c9-38cb1a20109e"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "c2e88c3f-e6b4-4576-88c9-38cb1a20109e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' 
' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 
-- # echo filename=Malloc2p0 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for 
b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:32.693 11:44:58 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:32.693 ************************************ 00:08:32.693 START TEST bdev_fio_trim 00:08:32.693 ************************************ 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- 
# local sanitizers 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:08:32.693 11:44:58 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:08:32.693 11:44:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:08:32.693 11:44:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:08:32.693 11:44:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:08:32.693 11:44:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:32.693 11:44:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:08:32.693 11:44:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:08:32.693 11:44:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:08:32.693 11:44:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:08:32.693 11:44:59 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:32.693 11:44:59 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:32.693 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 
4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:32.694 fio-3.35 00:08:32.694 Starting 14 threads 00:08:44.897 00:08:44.897 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1651494: Tue May 14 11:45:10 2024 00:08:44.897 write: IOPS=116k, BW=454MiB/s (476MB/s)(4539MiB/10001msec); 0 zone resets 00:08:44.897 slat (usec): min=5, max=439, avg=41.21, stdev=10.61 00:08:44.897 clat (usec): min=34, max=817, avg=306.12, stdev=98.64 00:08:44.897 lat (usec): min=52, max=873, avg=347.33, stdev=101.96 00:08:44.897 clat percentiles (usec): 00:08:44.897 | 50.000th=[ 297], 99.000th=[ 519], 99.900th=[ 562], 99.990th=[ 586], 00:08:44.897 | 99.999th=[ 676] 00:08:44.897 bw ( KiB/s): min=450720, max=488195, per=100.00%, avg=465027.42, stdev=798.06, samples=266 00:08:44.897 iops : min=112680, max=122048, avg=116256.79, stdev=199.51, samples=266 00:08:44.897 trim: IOPS=116k, BW=454MiB/s (476MB/s)(4539MiB/10001msec); 0 zone resets 00:08:44.897 slat (usec): min=4, max=143, avg=28.31, stdev= 7.25 00:08:44.897 clat (usec): min=27, max=874, avg=346.31, stdev=103.32 00:08:44.897 lat (usec): min=42, max=904, avg=374.62, stdev=105.77 00:08:44.897 clat percentiles (usec): 00:08:44.897 | 50.000th=[ 338], 99.000th=[ 570], 99.900th=[ 611], 99.990th=[ 644], 00:08:44.897 | 99.999th=[ 701] 00:08:44.897 bw ( KiB/s): min=450720, max=488195, per=100.00%, avg=465027.84, stdev=798.02, samples=266 00:08:44.897 iops : min=112680, max=122048, avg=116256.89, stdev=199.50, samples=266 00:08:44.897 lat (usec) : 50=0.01%, 100=0.42%, 250=25.63%, 500=68.95%, 750=4.98% 00:08:44.897 lat (usec) : 1000=0.01% 00:08:44.897 
cpu : usr=99.61%, sys=0.00%, ctx=416, majf=0, minf=27 00:08:44.897 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:44.897 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:44.897 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:44.897 issued rwts: total=0,1161992,1161994,0 short=0,0,0,0 dropped=0,0,0,0 00:08:44.897 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:44.897 00:08:44.897 Run status group 0 (all jobs): 00:08:44.897 WRITE: bw=454MiB/s (476MB/s), 454MiB/s-454MiB/s (476MB/s-476MB/s), io=4539MiB (4760MB), run=10001-10001msec 00:08:44.897 TRIM: bw=454MiB/s (476MB/s), 454MiB/s-454MiB/s (476MB/s-476MB/s), io=4539MiB (4760MB), run=10001-10001msec 00:08:44.897 00:08:44.897 real 0m11.649s 00:08:44.897 user 2m25.573s 00:08:44.897 sys 0m0.679s 00:08:44.897 11:45:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:44.897 11:45:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:08:44.897 ************************************ 00:08:44.897 END TEST bdev_fio_trim 00:08:44.897 ************************************ 00:08:44.897 11:45:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:08:44.897 11:45:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:44.897 11:45:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:08:44.897 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:44.897 11:45:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:08:44.897 00:08:44.897 real 0m23.783s 00:08:44.897 user 5m11.069s 00:08:44.897 sys 0m2.045s 00:08:44.897 11:45:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:44.897 11:45:10 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 
00:08:44.897 ************************************ 00:08:44.897 END TEST bdev_fio 00:08:44.897 ************************************ 00:08:44.897 11:45:10 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:44.897 11:45:10 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:44.897 11:45:10 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:08:44.897 11:45:10 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:44.897 11:45:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:44.897 ************************************ 00:08:44.897 START TEST bdev_verify 00:08:44.897 ************************************ 00:08:44.897 11:45:10 blockdev_general.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:44.897 [2024-05-14 11:45:10.827343] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:08:44.897 [2024-05-14 11:45:10.827420] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653451 ] 00:08:44.897 [2024-05-14 11:45:10.956896] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:44.897 [2024-05-14 11:45:11.058594] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:44.897 [2024-05-14 11:45:11.058598] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.897 [2024-05-14 11:45:11.219384] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:44.897 [2024-05-14 11:45:11.219448] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:44.897 [2024-05-14 11:45:11.219464] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:44.897 [2024-05-14 11:45:11.227391] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:44.897 [2024-05-14 11:45:11.227429] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:44.897 [2024-05-14 11:45:11.235412] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:44.897 [2024-05-14 11:45:11.235439] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:44.897 [2024-05-14 11:45:11.313208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:44.897 [2024-05-14 11:45:11.313262] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:44.897 [2024-05-14 11:45:11.313284] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13d3a90 00:08:44.898 [2024-05-14 11:45:11.313297] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:44.898 [2024-05-14 
11:45:11.314851] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:44.898 [2024-05-14 11:45:11.314887] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:44.898 Running I/O for 5 seconds... 00:08:50.245 00:08:50.245 Latency(us) 00:08:50.245 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:50.245 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x0 length 0x1000 00:08:50.245 Malloc0 : 5.21 1080.43 4.22 0.00 0.00 118250.81 594.81 238892.97 00:08:50.245 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x1000 length 0x1000 00:08:50.245 Malloc0 : 5.20 1058.19 4.13 0.00 0.00 120730.13 537.82 382958.19 00:08:50.245 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x0 length 0x800 00:08:50.245 Malloc1p0 : 5.26 559.51 2.19 0.00 0.00 227651.80 3561.74 229774.91 00:08:50.245 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x800 length 0x800 00:08:50.245 Malloc1p0 : 5.25 560.50 2.19 0.00 0.00 227296.70 3647.22 214274.23 00:08:50.245 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x0 length 0x800 00:08:50.245 Malloc1p1 : 5.27 559.15 2.18 0.00 0.00 227195.79 3604.48 227951.30 00:08:50.245 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x800 length 0x800 00:08:50.245 Malloc1p1 : 5.25 560.26 2.19 0.00 0.00 226752.11 3632.97 211538.81 00:08:50.245 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x0 length 0x200 00:08:50.245 Malloc2p0 : 5.27 558.78 2.18 0.00 0.00 226689.32 3647.22 224304.08 
00:08:50.245 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x200 length 0x200 00:08:50.245 Malloc2p0 : 5.26 560.01 2.19 0.00 0.00 226183.47 3604.48 207891.59 00:08:50.245 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x0 length 0x200 00:08:50.245 Malloc2p1 : 5.27 558.38 2.18 0.00 0.00 226194.06 3575.99 220656.86 00:08:50.245 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x200 length 0x200 00:08:50.245 Malloc2p1 : 5.26 559.76 2.19 0.00 0.00 225645.56 3561.74 205156.17 00:08:50.245 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.245 Verification LBA range: start 0x0 length 0x200 00:08:50.246 Malloc2p2 : 5.28 557.99 2.18 0.00 0.00 225715.53 3618.73 214274.23 00:08:50.246 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x200 length 0x200 00:08:50.246 Malloc2p2 : 5.26 559.48 2.19 0.00 0.00 225119.88 3604.48 201508.95 00:08:50.246 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x0 length 0x200 00:08:50.246 Malloc2p3 : 5.28 557.60 2.18 0.00 0.00 225242.56 3547.49 210627.01 00:08:50.246 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x200 length 0x200 00:08:50.246 Malloc2p3 : 5.27 559.12 2.18 0.00 0.00 224646.31 3575.99 196949.93 00:08:50.246 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x0 length 0x200 00:08:50.246 Malloc2p4 : 5.28 557.23 2.18 0.00 0.00 224808.25 3490.50 206979.78 00:08:50.246 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x200 length 0x200 
00:08:50.246 Malloc2p4 : 5.27 558.75 2.18 0.00 0.00 224193.84 3476.26 193302.71 00:08:50.246 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x0 length 0x200 00:08:50.246 Malloc2p5 : 5.29 556.98 2.18 0.00 0.00 224333.06 3519.00 200597.15 00:08:50.246 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x200 length 0x200 00:08:50.246 Malloc2p5 : 5.27 558.36 2.18 0.00 0.00 223798.04 3476.26 189655.49 00:08:50.246 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x0 length 0x200 00:08:50.246 Malloc2p6 : 5.29 556.68 2.17 0.00 0.00 223853.96 3447.76 199685.34 00:08:50.246 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x200 length 0x200 00:08:50.246 Malloc2p6 : 5.28 557.96 2.18 0.00 0.00 223363.66 3447.76 185096.46 00:08:50.246 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x0 length 0x200 00:08:50.246 Malloc2p7 : 5.29 556.38 2.17 0.00 0.00 223388.05 3376.53 197861.73 00:08:50.246 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x200 length 0x200 00:08:50.246 Malloc2p7 : 5.28 557.57 2.18 0.00 0.00 222942.13 3390.78 183272.85 00:08:50.246 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x0 length 0x1000 00:08:50.246 TestPT : 5.29 533.34 2.08 0.00 0.00 230415.87 21769.35 198773.54 00:08:50.246 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x1000 length 0x1000 00:08:50.246 TestPT : 5.30 535.34 2.09 0.00 0.00 230990.83 11910.46 268070.73 00:08:50.246 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 
00:08:50.246 Verification LBA range: start 0x0 length 0x2000 00:08:50.246 raid0 : 5.30 555.84 2.17 0.00 0.00 221998.58 3276.80 174154.80 00:08:50.246 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x2000 length 0x2000 00:08:50.246 raid0 : 5.29 557.04 2.18 0.00 0.00 221578.88 3291.05 158654.11 00:08:50.246 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x0 length 0x2000 00:08:50.246 concat0 : 5.30 555.60 2.17 0.00 0.00 221492.87 3262.55 171419.38 00:08:50.246 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x2000 length 0x2000 00:08:50.246 concat0 : 5.29 556.74 2.17 0.00 0.00 221119.39 3248.31 167772.16 00:08:50.246 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x0 length 0x1000 00:08:50.246 raid1 : 5.30 555.39 2.17 0.00 0.00 220979.21 3903.67 177802.02 00:08:50.246 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x1000 length 0x1000 00:08:50.246 raid1 : 5.29 556.44 2.17 0.00 0.00 220631.57 4017.64 175978.41 00:08:50.246 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x0 length 0x4e2 00:08:50.246 AIO0 : 5.30 555.23 2.17 0.00 0.00 220355.63 1460.31 187831.87 00:08:50.246 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:50.246 Verification LBA range: start 0x4e2 length 0x4e2 00:08:50.246 AIO0 : 5.29 556.20 2.17 0.00 0.00 220086.97 1446.07 183272.85 00:08:50.246 =================================================================================================================== 00:08:50.246 Total : 18826.23 73.54 0.00 0.00 212689.01 537.82 382958.19 00:08:50.505 00:08:50.505 real 0m6.563s 00:08:50.505 user 0m12.134s 00:08:50.505 sys 0m0.419s 
00:08:50.505 11:45:17 blockdev_general.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:50.505 11:45:17 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:08:50.505 ************************************ 00:08:50.505 END TEST bdev_verify 00:08:50.505 ************************************ 00:08:50.505 11:45:17 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:50.505 11:45:17 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:08:50.505 11:45:17 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:50.505 11:45:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:50.505 ************************************ 00:08:50.505 START TEST bdev_verify_big_io 00:08:50.505 ************************************ 00:08:50.505 11:45:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:50.505 [2024-05-14 11:45:17.483977] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:08:50.505 [2024-05-14 11:45:17.484043] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1654346 ] 00:08:50.764 [2024-05-14 11:45:17.613942] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:50.764 [2024-05-14 11:45:17.718675] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:50.764 [2024-05-14 11:45:17.718679] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.022 [2024-05-14 11:45:17.880159] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:51.022 [2024-05-14 11:45:17.880223] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:51.022 [2024-05-14 11:45:17.880237] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:51.022 [2024-05-14 11:45:17.888168] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:51.022 [2024-05-14 11:45:17.888194] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:51.022 [2024-05-14 11:45:17.896179] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:51.022 [2024-05-14 11:45:17.896202] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:51.022 [2024-05-14 11:45:17.973815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:51.022 [2024-05-14 11:45:17.973870] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:51.022 [2024-05-14 11:45:17.973889] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeaaa90 00:08:51.022 [2024-05-14 11:45:17.973902] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:51.022 [2024-05-14 
11:45:17.975467] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:51.022 [2024-05-14 11:45:17.975497] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:51.280 [2024-05-14 11:45:18.166624] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.167940] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.169886] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.171118] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.172965] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.174178] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). 
Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.176010] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.177895] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.179095] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.180695] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.181641] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.183139] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.184087] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.185620] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.186573] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.188070] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:51.280 [2024-05-14 11:45:18.212049] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:51.280 [2024-05-14 11:45:18.214033] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:51.280 Running I/O for 5 seconds... 
00:08:59.405 00:08:59.405 Latency(us) 00:08:59.405 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:59.405 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:59.405 Verification LBA range: start 0x0 length 0x100 00:08:59.405 Malloc0 : 5.95 150.60 9.41 0.00 0.00 833523.63 926.05 2348810.24 00:08:59.405 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:59.405 Verification LBA range: start 0x100 length 0x100 00:08:59.405 Malloc0 : 5.65 136.03 8.50 0.00 0.00 921984.61 911.81 2698943.44 00:08:59.405 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:59.405 Verification LBA range: start 0x0 length 0x80 00:08:59.405 Malloc1p0 : 6.33 68.87 4.30 0.00 0.00 1718613.52 3419.27 2801065.63 00:08:59.405 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:59.405 Verification LBA range: start 0x80 length 0x80 00:08:59.405 Malloc1p0 : 6.33 55.57 3.47 0.00 0.00 2086498.84 3063.10 3136609.95 00:08:59.405 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:59.405 Verification LBA range: start 0x0 length 0x80 00:08:59.405 Malloc1p1 : 6.72 35.71 2.23 0.00 0.00 3152894.31 2664.18 5339531.35 00:08:59.405 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:59.405 Verification LBA range: start 0x80 length 0x80 00:08:59.405 Malloc1p1 : 6.77 37.81 2.36 0.00 0.00 2966501.82 2692.67 5076931.45 00:08:59.405 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:59.406 Verification LBA range: start 0x0 length 0x20 00:08:59.406 Malloc2p0 : 6.23 23.10 1.44 0.00 0.00 1209040.20 837.01 1984088.15 00:08:59.406 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:59.406 Verification LBA range: start 0x20 length 0x20 00:08:59.406 Malloc2p0 : 6.24 25.65 1.60 0.00 0.00 1090361.64 826.32 1743371.58 00:08:59.406 Job: Malloc2p1 (Core Mask 0x1, 
workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x20
00:08:59.406 Malloc2p1 : 6.33 25.26 1.58 0.00 0.00 1115468.68 790.71 1954910.39
00:08:59.406 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x20 length 0x20
00:08:59.406 Malloc2p1 : 6.24 25.64 1.60 0.00 0.00 1080073.76 801.39 1721488.25
00:08:59.406 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x20
00:08:59.406 Malloc2p2 : 6.33 25.26 1.58 0.00 0.00 1106022.83 801.39 1940321.50
00:08:59.406 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x20 length 0x20
00:08:59.406 Malloc2p2 : 6.34 27.77 1.74 0.00 0.00 1005124.82 808.51 1692310.48
00:08:59.406 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x20
00:08:59.406 Malloc2p3 : 6.34 25.25 1.58 0.00 0.00 1095690.05 801.39 1896554.85
00:08:59.406 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x20 length 0x20
00:08:59.406 Malloc2p3 : 6.34 27.76 1.73 0.00 0.00 996123.69 808.51 1670427.16
00:08:59.406 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x20
00:08:59.406 Malloc2p4 : 6.34 25.24 1.58 0.00 0.00 1085811.03 744.40 1881965.97
00:08:59.406 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x20 length 0x20
00:08:59.406 Malloc2p4 : 6.34 27.75 1.73 0.00 0.00 987476.74 762.21 1641249.39
00:08:59.406 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x20
00:08:59.406 Malloc2p5 : 6.34 25.23 1.58 0.00 0.00 1075798.56 719.47 1845493.76
00:08:59.406 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x20 length 0x20
00:08:59.406 Malloc2p5 : 6.34 27.74 1.73 0.00 0.00 978350.76 740.84 1619366.07
00:08:59.406 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x20
00:08:59.406 Malloc2p6 : 6.34 25.23 1.58 0.00 0.00 1065834.20 740.84 1823610.43
00:08:59.406 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x20 length 0x20
00:08:59.406 Malloc2p6 : 6.35 27.73 1.73 0.00 0.00 969076.90 730.16 1597482.74
00:08:59.406 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x20
00:08:59.406 Malloc2p7 : 6.35 25.22 1.58 0.00 0.00 1055799.44 683.85 1794432.67
00:08:59.406 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x20 length 0x20
00:08:59.406 Malloc2p7 : 6.35 27.72 1.73 0.00 0.00 960317.98 701.66 1568304.97
00:08:59.406 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x100
00:08:59.406 TestPT : 6.77 35.74 2.23 0.00 0.00 2857186.86 109416.63 3997354.07
00:08:59.406 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x100 length 0x100
00:08:59.406 TestPT : 6.84 35.11 2.19 0.00 0.00 2893873.15 101666.28 3647220.87
00:08:59.406 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x200
00:08:59.406 raid0 : 6.67 40.81 2.55 0.00 0.00 2433445.65 1880.60 4726798.25
00:08:59.406 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x200 length 0x200
00:08:59.406 raid0 : 6.70 42.96 2.68 0.00 0.00 2304241.41 1866.35 4493376.11
00:08:59.406 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x200
00:08:59.406 concat0 : 6.72 54.00 3.37 0.00 0.00 1801936.26 1837.86 4551731.65
00:08:59.406 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x200 length 0x200
00:08:59.406 concat0 : 6.90 48.71 3.04 0.00 0.00 1988679.11 1866.35 4318309.51
00:08:59.406 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x100
00:08:59.406 raid1 : 6.84 52.94 3.31 0.00 0.00 1774972.01 2721.17 4376665.04
00:08:59.406 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x100 length 0x100
00:08:59.406 raid1 : 6.90 53.33 3.33 0.00 0.00 1771806.97 2735.42 4114065.14
00:08:59.406 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x0 length 0x4e
00:08:59.406 AIO0 : 6.94 65.95 4.12 0.00 0.00 856861.97 829.89 2801065.63
00:08:59.406 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536)
00:08:59.406 Verification LBA range: start 0x4e length 0x4e
00:08:59.406 AIO0 : 6.95 77.13 4.82 0.00 0.00 732662.50 829.89 2509287.96
00:08:59.406 ===================================================================================================================
00:08:59.406 Total : 1408.80 88.05 0.00 0.00 1473933.85 683.85 5339531.35
00:08:59.406
00:08:59.406 real 0m8.288s
00:08:59.406 user 0m15.535s
00:08:59.406 sys 0m0.457s
00:08:59.406 11:45:25 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable
00:08:59.406 11:45:25 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:08:59.406 ************************************
00:08:59.406 END TEST bdev_verify_big_io
00:08:59.406 ************************************
00:08:59.406 11:45:25 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:08:59.406 11:45:25 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:08:59.406 11:45:25 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable
00:08:59.406 11:45:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:08:59.406 ************************************
00:08:59.406 START TEST bdev_write_zeroes
00:08:59.406 ************************************
00:08:59.406 11:45:25 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:08:59.406 [2024-05-14 11:45:25.859498] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:08:59.407 [2024-05-14 11:45:25.859558] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655426 ]
00:08:59.407 [2024-05-14 11:45:25.977414] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:59.407 [2024-05-14 11:45:26.078502] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:59.407 [2024-05-14 11:45:26.239783] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:08:59.407 [2024-05-14 11:45:26.239839] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:08:59.407 [2024-05-14 11:45:26.239854] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:08:59.407 [2024-05-14 11:45:26.247789] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:08:59.407 [2024-05-14 11:45:26.247817] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:08:59.407 [2024-05-14 11:45:26.255805] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:08:59.407 [2024-05-14 11:45:26.255831] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:08:59.407 [2024-05-14 11:45:26.333097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:08:59.407 [2024-05-14 11:45:26.333144] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:08:59.407 [2024-05-14 11:45:26.333163] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e4b60
00:08:59.407 [2024-05-14 11:45:26.333175] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:08:59.407 [2024-05-14 11:45:26.334682] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:08:59.407 [2024-05-14 11:45:26.334710] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:08:59.666 Running I/O for 1 seconds...
00:09:00.600
00:09:00.600 Latency(us)
00:09:00.600 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:00.600 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.600 Malloc0 : 1.05 5016.07 19.59 0.00 0.00 25498.20 644.67 42170.99
00:09:00.600 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.600 Malloc1p0 : 1.05 5008.83 19.57 0.00 0.00 25489.86 894.00 41487.14
00:09:00.600 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.600 Malloc1p1 : 1.05 5001.49 19.54 0.00 0.00 25476.33 886.87 40575.33
00:09:00.600 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.600 Malloc2p0 : 1.05 4994.32 19.51 0.00 0.00 25455.76 886.87 39663.53
00:09:00.600 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.600 Malloc2p1 : 1.05 4987.26 19.48 0.00 0.00 25436.79 886.87 38751.72
00:09:00.600 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.600 Malloc2p2 : 1.05 4980.20 19.45 0.00 0.00 25415.17 886.87 37839.92
00:09:00.600 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.600 Malloc2p3 : 1.06 4973.14 19.43 0.00 0.00 25395.62 886.87 36928.11
00:09:00.601 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.601 Malloc2p4 : 1.06 4966.14 19.40 0.00 0.00 25380.07 883.31 36244.26
00:09:00.601 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.601 Malloc2p5 : 1.06 4959.18 19.37 0.00 0.00 25359.60 890.43 35332.45
00:09:00.601 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.601 Malloc2p6 : 1.06 4952.15 19.34 0.00 0.00 25338.48 883.31 34420.65
00:09:00.601 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.601 Malloc2p7 : 1.06 4945.22 19.32 0.00 0.00 25323.33 883.31 33508.84
00:09:00.601 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.601 TestPT : 1.06 4938.29 19.29 0.00 0.00 25303.41 926.05 32597.04
00:09:00.601 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.601 raid0 : 1.06 4930.34 19.26 0.00 0.00 25277.43 1602.78 31001.38
00:09:00.601 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.601 concat0 : 1.07 4922.51 19.23 0.00 0.00 25226.34 1588.54 29405.72
00:09:00.601 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.601 raid1 : 1.07 4912.78 19.19 0.00 0.00 25166.06 2521.71 26898.25
00:09:00.601 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:00.601 AIO0 : 1.07 4906.83 19.17 0.00 0.00 25076.31 1018.66 26100.42
00:09:00.601 ===================================================================================================================
00:09:00.601 Total : 79394.76 310.14 0.00 0.00 25351.17 644.67 42170.99
00:09:01.166
00:09:01.166 real 0m2.207s
00:09:01.166 user 0m1.793s
00:09:01.166 sys 0m0.337s
00:09:01.166 11:45:28 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable
00:09:01.166 11:45:28 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:09:01.166 ************************************
00:09:01.166 END TEST bdev_write_zeroes
00:09:01.166 ************************************
00:09:01.166 11:45:28 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:01.166 11:45:28 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:09:01.166 11:45:28 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable
00:09:01.166 11:45:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:01.166 ************************************
00:09:01.166 START TEST bdev_json_nonenclosed
00:09:01.166 ************************************
00:09:01.166 11:45:28 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:01.166 [2024-05-14 11:45:28.157781] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:09:01.166 [2024-05-14 11:45:28.157841] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655789 ]
00:09:01.424 [2024-05-14 11:45:28.287683] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:01.424 [2024-05-14 11:45:28.388645] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:01.424 [2024-05-14 11:45:28.388713] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:09:01.424 [2024-05-14 11:45:28.388734] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:09:01.424 [2024-05-14 11:45:28.388746] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:09:01.424
00:09:01.424 real 0m0.393s
00:09:01.424 user 0m0.240s
00:09:01.424 sys 0m0.151s
00:09:01.424 11:45:28 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable
00:09:01.424 11:45:28 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:09:01.424 ************************************
00:09:01.424 END TEST bdev_json_nonenclosed
00:09:01.424 ************************************
00:09:01.682 11:45:28 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:01.682 11:45:28 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:09:01.682 11:45:28 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable
00:09:01.682 11:45:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:01.682 ************************************
00:09:01.682 START TEST bdev_json_nonarray
00:09:01.682 ************************************
00:09:01.682 11:45:28 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:01.682 [2024-05-14 11:45:28.648977] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:09:01.682 [2024-05-14 11:45:28.649046] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655810 ]
00:09:01.940 [2024-05-14 11:45:28.781785] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:01.940 [2024-05-14 11:45:28.891438] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:01.940 [2024-05-14 11:45:28.891515] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:09:01.940 [2024-05-14 11:45:28.891537] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:09:01.940 [2024-05-14 11:45:28.891550] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:09:01.940
00:09:01.940 real 0m0.411s
00:09:01.940 user 0m0.234s
00:09:01.940 sys 0m0.173s
00:09:01.940 11:45:28 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable
00:09:01.940 11:45:28 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:09:01.940 ************************************
00:09:01.940 END TEST bdev_json_nonarray
00:09:01.940 ************************************
00:09:02.198 11:45:29 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]]
00:09:02.198 11:45:29 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite ''
00:09:02.198 11:45:29 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']'
00:09:02.198 11:45:29 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable
00:09:02.198 11:45:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:02.198 ************************************
00:09:02.198 START TEST bdev_qos
00:09:02.198 ************************************
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- common/autotest_common.sh@1121 -- # qos_test_suite ''
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=1655976
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 1655976'
00:09:02.198 Process qos testing pid: 1655976
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 ''
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 1655976
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- common/autotest_common.sh@827 -- # '[' -z 1655976 ']'
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- common/autotest_common.sh@832 -- # local max_retries=100
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:09:02.198 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # xtrace_disable
00:09:02.198 11:45:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:02.198 [2024-05-14 11:45:29.151887] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:09:02.198 [2024-05-14 11:45:29.151954] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655976 ]
00:09:02.198 [2024-05-14 11:45:29.274152] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:02.456 [2024-05-14 11:45:29.371975] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:09:03.022 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@856 -- # (( i == 0 ))
00:09:03.022 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@860 -- # return 0
00:09:03.022 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512
00:09:03.022 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:09:03.022 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:03.281 Malloc_0
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@895 -- # local bdev_name=Malloc_0
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@896 -- # local bdev_timeout=
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local i
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # [[ -z '' ]]
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # bdev_timeout=2000
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:03.281 [
00:09:03.281 {
00:09:03.281 "name": "Malloc_0",
00:09:03.281 "aliases": [
00:09:03.281 "19153496-2a97-4af8-a16a-c2757f4430b8"
00:09:03.281 ],
00:09:03.281 "product_name": "Malloc disk",
00:09:03.281 "block_size": 512,
00:09:03.281 "num_blocks": 262144,
00:09:03.281 "uuid": "19153496-2a97-4af8-a16a-c2757f4430b8",
00:09:03.281 "assigned_rate_limits": {
00:09:03.281 "rw_ios_per_sec": 0,
00:09:03.281 "rw_mbytes_per_sec": 0,
00:09:03.281 "r_mbytes_per_sec": 0,
00:09:03.281 "w_mbytes_per_sec": 0
00:09:03.281 },
00:09:03.281 "claimed": false,
00:09:03.281 "zoned": false,
00:09:03.281 "supported_io_types": {
00:09:03.281 "read": true,
00:09:03.281 "write": true,
00:09:03.281 "unmap": true,
00:09:03.281 "write_zeroes": true,
00:09:03.281 "flush": true,
00:09:03.281 "reset": true,
00:09:03.281 "compare": false,
00:09:03.281 "compare_and_write": false,
00:09:03.281 "abort": true,
00:09:03.281 "nvme_admin": false,
00:09:03.281 "nvme_io": false
00:09:03.281 },
00:09:03.281 "memory_domains": [
00:09:03.281 {
00:09:03.281 "dma_device_id": "system",
00:09:03.281 "dma_device_type": 1
00:09:03.281 },
00:09:03.281 {
00:09:03.281 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:09:03.281 "dma_device_type": 2
00:09:03.281 }
00:09:03.281 ],
00:09:03.281 "driver_specific": {}
00:09:03.281 }
00:09:03.281 ]
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # return 0
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:03.281 Null_1
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@895 -- # local bdev_name=Null_1
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@896 -- # local bdev_timeout=
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local i
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # [[ -z '' ]]
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # bdev_timeout=2000
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:09:03.281 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:03.282 [
00:09:03.282 {
00:09:03.282 "name": "Null_1",
00:09:03.282 "aliases": [
00:09:03.282 "e656a5c9-5af0-47f2-91f6-32faa97e8504"
00:09:03.282 ],
00:09:03.282 "product_name": "Null disk",
00:09:03.282 "block_size": 512,
00:09:03.282 "num_blocks": 262144,
00:09:03.282 "uuid": "e656a5c9-5af0-47f2-91f6-32faa97e8504",
00:09:03.282 "assigned_rate_limits": {
00:09:03.282 "rw_ios_per_sec": 0,
00:09:03.282 "rw_mbytes_per_sec": 0,
00:09:03.282 "r_mbytes_per_sec": 0,
00:09:03.282 "w_mbytes_per_sec": 0
00:09:03.282 },
00:09:03.282 "claimed": false,
00:09:03.282 "zoned": false,
00:09:03.282 "supported_io_types": {
00:09:03.282 "read": true,
00:09:03.282 "write": true,
00:09:03.282 "unmap": false,
00:09:03.282 "write_zeroes": true,
00:09:03.282 "flush": false,
00:09:03.282 "reset": true,
00:09:03.282 "compare": false,
00:09:03.282 "compare_and_write": false,
00:09:03.282 "abort": true,
00:09:03.282 "nvme_admin": false,
00:09:03.282 "nvme_io": false
00:09:03.282 },
00:09:03.282 "driver_specific": {}
00:09:03.282 }
00:09:03.282 ]
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # return 0
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0
00:09:03.282 11:45:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1
00:09:03.282 Running I/O for 60 seconds...
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 60994.44 243977.76 0.00 0.00 245760.00 0.00 0.00 '
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']'
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}'
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=60994.44
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 60994
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=60994
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=15000
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 15000 -gt 1000 ']'
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 15000 Malloc_0
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 15000 IOPS Malloc_0
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']'
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- common/autotest_common.sh@1103 -- # xtrace_disable
00:09:08.552 11:45:35 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:08.552 ************************************
00:09:08.552 START TEST bdev_qos_iops
00:09:08.552 ************************************
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1121 -- # run_qos_test 15000 IOPS Malloc_0
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=15000
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0
00:09:08.552 11:45:35 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 15002.75 60011.02 0.00 0.00 61200.00 0.00 0.00 '
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']'
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}'
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=15002.75
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 15002
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=15002
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']'
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=13500
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=16500
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 15002 -lt 13500 ']'
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 15002 -gt 16500 ']'
00:09:13.860
00:09:13.860 real 0m5.261s
00:09:13.860 user 0m0.116s
00:09:13.860 sys 0m0.043s
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1122 -- # xtrace_disable
00:09:13.860 11:45:40 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x
00:09:13.860 ************************************
00:09:13.860 END TEST bdev_qos_iops
00:09:13.860 ************************************
00:09:13.860 11:45:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1
00:09:13.860 11:45:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH
00:09:13.860 11:45:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1
00:09:13.860 11:45:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result
00:09:13.860 11:45:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5
00:09:13.860 11:45:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1
00:09:13.860 11:45:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 19861.75 79447.01 0.00 0.00 80896.00 0.00 0.00 '
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']'
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']'
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}'
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=80896.00
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 80896
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=80896
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=7
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 7 -lt 2 ']'
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 7 Null_1
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 7 BANDWIDTH Null_1
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']'
00:09:19.150 11:45:45 blockdev_general.bdev_qos -- common/autotest_common.sh@1103 -- # xtrace_disable
00:09:19.150 11:45:46 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:19.150 ************************************
00:09:19.150 START TEST bdev_qos_bw
00:09:19.150 ************************************
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1121 -- # run_qos_test 7 BANDWIDTH Null_1
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=7
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1
00:09:19.150 11:45:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 1791.76 7167.05 0.00 0.00 7312.00 0.00 0.00 '
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']'
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']'
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}'
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=7312.00
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 7312
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=7312
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']'
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=7168
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=6451
00:09:24.419 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=7884
00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 7312 -lt 6451 ']'
00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 7312 -gt 7884 ']'
00:09:24.420
00:09:24.420 real 0m5.277s
00:09:24.420 user 0m0.113s
00:09:24.420 sys 0m0.049s
00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1122 -- # xtrace_disable
00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x
00:09:24.420 ************************************
00:09:24.420 END TEST bdev_qos_bw
00:09:24.420 ************************************
00:09:24.420 11:45:51 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0
00:09:24.420 11:45:51 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:09:24.420 11:45:51 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:24.420 11:45:51 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:09:24.420 11:45:51 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0
00:09:24.420 11:45:51 blockdev_general.bdev_qos -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']'
00:09:24.420 11:45:51 blockdev_general.bdev_qos -- common/autotest_common.sh@1103 -- # xtrace_disable
00:09:24.420 11:45:51 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:09:24.420 ************************************
00:09:24.420 START TEST bdev_qos_ro_bw
00:09:24.420 ************************************
00:09:24.420 11:45:51
blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1121 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:24.420 11:45:51 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.72 2046.87 0.00 0.00 2060.00 0.00 0.00 ' 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # 
qos_result=2060 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:09:29.689 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:09:29.689 00:09:29.689 real 0m5.184s 00:09:29.689 user 0m0.116s 00:09:29.690 sys 0m0.041s 00:09:29.690 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:29.690 11:45:56 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:09:29.690 ************************************ 00:09:29.690 END TEST bdev_qos_ro_bw 00:09:29.690 ************************************ 00:09:29.690 11:45:56 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:09:29.690 11:45:56 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:29.690 11:45:56 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:30.257 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:30.257 11:45:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:09:30.257 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:30.257 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:30.516 00:09:30.516 Latency(us) 00:09:30.516 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:30.516 Job: 
Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:30.516 Malloc_0 : 26.83 20551.33 80.28 0.00 0.00 12341.33 2065.81 503316.48 00:09:30.516 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:30.516 Null_1 : 27.00 19971.93 78.02 0.00 0.00 12783.03 865.50 164124.94 00:09:30.516 =================================================================================================================== 00:09:30.516 Total : 40523.26 158.29 0.00 0.00 12559.70 865.50 503316.48 00:09:30.516 0 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 1655976 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@946 -- # '[' -z 1655976 ']' 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@950 -- # kill -0 1655976 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@951 -- # uname 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1655976 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1655976' 00:09:30.516 killing process with pid 1655976 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@965 -- # kill 1655976 00:09:30.516 Received shutdown signal, test time was about 27.062575 seconds 00:09:30.516 00:09:30.516 Latency(us) 00:09:30.516 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:30.516 
=================================================================================================================== 00:09:30.516 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:30.516 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@970 -- # wait 1655976 00:09:30.776 11:45:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:09:30.776 00:09:30.776 real 0m28.548s 00:09:30.776 user 0m29.301s 00:09:30.776 sys 0m0.853s 00:09:30.776 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:30.776 11:45:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:30.776 ************************************ 00:09:30.776 END TEST bdev_qos 00:09:30.776 ************************************ 00:09:30.776 11:45:57 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:09:30.776 11:45:57 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:09:30.776 11:45:57 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:30.776 11:45:57 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:30.776 ************************************ 00:09:30.776 START TEST bdev_qd_sampling 00:09:30.776 ************************************ 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1121 -- # qd_sampling_test_suite '' 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=1659795 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 1659795' 00:09:30.776 Process bdev QD sampling period testing pid: 1659795 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 1659795 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@827 -- # '[' -z 1659795 ']' 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:30.776 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:30.776 11:45:57 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:30.776 [2024-05-14 11:45:57.800986] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:09:30.776 [2024-05-14 11:45:57.801058] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659795 ] 00:09:31.034 [2024-05-14 11:45:57.932575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:31.034 [2024-05-14 11:45:58.036005] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:31.035 [2024-05-14 11:45:58.036010] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@860 -- # return 0 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:31.970 Malloc_QD 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@895 -- # local bdev_name=Malloc_QD 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local i 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:31.970 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:31.970 [ 00:09:31.970 { 00:09:31.970 "name": "Malloc_QD", 00:09:31.970 "aliases": [ 00:09:31.970 "149ed924-de2a-47fb-9fae-0a4e9d57f976" 00:09:31.970 ], 00:09:31.971 "product_name": "Malloc disk", 00:09:31.971 "block_size": 512, 00:09:31.971 "num_blocks": 262144, 00:09:31.971 "uuid": "149ed924-de2a-47fb-9fae-0a4e9d57f976", 00:09:31.971 "assigned_rate_limits": { 00:09:31.971 "rw_ios_per_sec": 0, 00:09:31.971 "rw_mbytes_per_sec": 0, 00:09:31.971 "r_mbytes_per_sec": 0, 00:09:31.971 "w_mbytes_per_sec": 0 00:09:31.971 }, 00:09:31.971 "claimed": false, 00:09:31.971 "zoned": false, 00:09:31.971 "supported_io_types": { 00:09:31.971 "read": true, 00:09:31.971 "write": true, 00:09:31.971 "unmap": true, 00:09:31.971 "write_zeroes": true, 00:09:31.971 "flush": true, 00:09:31.971 "reset": true, 00:09:31.971 "compare": false, 00:09:31.971 "compare_and_write": false, 00:09:31.971 "abort": true, 00:09:31.971 "nvme_admin": false, 00:09:31.971 "nvme_io": false 00:09:31.971 }, 00:09:31.971 "memory_domains": [ 00:09:31.971 { 00:09:31.971 "dma_device_id": "system", 00:09:31.971 "dma_device_type": 1 00:09:31.971 }, 00:09:31.971 { 00:09:31.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:31.971 "dma_device_type": 2 00:09:31.971 } 00:09:31.971 ], 00:09:31.971 
"driver_specific": {} 00:09:31.971 } 00:09:31.971 ] 00:09:31.971 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:31.971 11:45:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@903 -- # return 0 00:09:31.971 11:45:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:09:31.971 11:45:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:31.971 Running I/O for 5 seconds... 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # 
iostats='{ 00:09:33.872 "tick_rate": 2300000000, 00:09:33.872 "ticks": 6945958328746756, 00:09:33.872 "bdevs": [ 00:09:33.872 { 00:09:33.872 "name": "Malloc_QD", 00:09:33.872 "bytes_read": 762360320, 00:09:33.872 "num_read_ops": 186116, 00:09:33.872 "bytes_written": 0, 00:09:33.872 "num_write_ops": 0, 00:09:33.872 "bytes_unmapped": 0, 00:09:33.872 "num_unmap_ops": 0, 00:09:33.872 "bytes_copied": 0, 00:09:33.872 "num_copy_ops": 0, 00:09:33.872 "read_latency_ticks": 2236141004434, 00:09:33.872 "max_read_latency_ticks": 15139420, 00:09:33.872 "min_read_latency_ticks": 301478, 00:09:33.872 "write_latency_ticks": 0, 00:09:33.872 "max_write_latency_ticks": 0, 00:09:33.872 "min_write_latency_ticks": 0, 00:09:33.872 "unmap_latency_ticks": 0, 00:09:33.872 "max_unmap_latency_ticks": 0, 00:09:33.872 "min_unmap_latency_ticks": 0, 00:09:33.872 "copy_latency_ticks": 0, 00:09:33.872 "max_copy_latency_ticks": 0, 00:09:33.872 "min_copy_latency_ticks": 0, 00:09:33.872 "io_error": {}, 00:09:33.872 "queue_depth_polling_period": 10, 00:09:33.872 "queue_depth": 512, 00:09:33.872 "io_time": 20, 00:09:33.872 "weighted_io_time": 10240 00:09:33.872 } 00:09:33.872 ] 00:09:33.872 }' 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:33.872 00:09:33.872 Latency(us) 00:09:33.872 
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:33.872 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:33.872 Malloc_QD : 1.98 48058.11 187.73 0.00 0.00 5313.24 1389.08 6610.59 00:09:33.872 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:33.872 Malloc_QD : 1.98 49827.75 194.64 0.00 0.00 5125.28 933.18 6468.12 00:09:33.872 =================================================================================================================== 00:09:33.872 Total : 97885.86 382.37 0.00 0.00 5217.52 933.18 6610.59 00:09:33.872 0 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 1659795 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@946 -- # '[' -z 1659795 ']' 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@950 -- # kill -0 1659795 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@951 -- # uname 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:33.872 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1659795 00:09:34.129 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:34.129 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:34.129 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1659795' 00:09:34.129 killing process with pid 1659795 00:09:34.130 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@965 -- # kill 1659795 00:09:34.130 Received shutdown signal, test time was about 2.061474 
seconds 00:09:34.130 00:09:34.130 Latency(us) 00:09:34.130 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:34.130 =================================================================================================================== 00:09:34.130 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:34.130 11:46:00 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@970 -- # wait 1659795 00:09:34.130 11:46:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:09:34.130 00:09:34.130 real 0m3.440s 00:09:34.130 user 0m6.724s 00:09:34.130 sys 0m0.450s 00:09:34.130 11:46:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:34.130 11:46:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:34.130 ************************************ 00:09:34.130 END TEST bdev_qd_sampling 00:09:34.130 ************************************ 00:09:34.388 11:46:01 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:09:34.388 11:46:01 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:09:34.388 11:46:01 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:34.388 11:46:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:34.388 ************************************ 00:09:34.388 START TEST bdev_error 00:09:34.388 ************************************ 00:09:34.388 11:46:01 blockdev_general.bdev_error -- common/autotest_common.sh@1121 -- # error_test_suite '' 00:09:34.388 11:46:01 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:09:34.388 11:46:01 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:09:34.388 11:46:01 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:09:34.388 11:46:01 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=1660324 00:09:34.388 11:46:01 
blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 1660324' 00:09:34.388 Process error testing pid: 1660324 00:09:34.388 11:46:01 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:09:34.388 11:46:01 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 1660324 00:09:34.388 11:46:01 blockdev_general.bdev_error -- common/autotest_common.sh@827 -- # '[' -z 1660324 ']' 00:09:34.388 11:46:01 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:34.388 11:46:01 blockdev_general.bdev_error -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:34.388 11:46:01 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:34.388 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:34.388 11:46:01 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:34.388 11:46:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:34.388 [2024-05-14 11:46:01.327743] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:09:34.388 [2024-05-14 11:46:01.327814] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1660324 ] 00:09:34.388 [2024-05-14 11:46:01.449193] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:34.646 [2024-05-14 11:46:01.555724] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # return 0 00:09:35.211 11:46:02 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:35.211 Dev_1 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:35.211 11:46:02 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_1 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:09:35.211 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:35.211 11:46:02 
blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:35.469 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:35.469 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:35.469 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:35.469 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:35.469 [ 00:09:35.469 { 00:09:35.469 "name": "Dev_1", 00:09:35.469 "aliases": [ 00:09:35.469 "5e06c91d-04e3-4d5d-9f76-f8e13851fdf3" 00:09:35.469 ], 00:09:35.469 "product_name": "Malloc disk", 00:09:35.469 "block_size": 512, 00:09:35.469 "num_blocks": 262144, 00:09:35.469 "uuid": "5e06c91d-04e3-4d5d-9f76-f8e13851fdf3", 00:09:35.469 "assigned_rate_limits": { 00:09:35.469 "rw_ios_per_sec": 0, 00:09:35.469 "rw_mbytes_per_sec": 0, 00:09:35.469 "r_mbytes_per_sec": 0, 00:09:35.469 "w_mbytes_per_sec": 0 00:09:35.469 }, 00:09:35.469 "claimed": false, 00:09:35.469 "zoned": false, 00:09:35.469 "supported_io_types": { 00:09:35.469 "read": true, 00:09:35.469 "write": true, 00:09:35.469 "unmap": true, 00:09:35.469 "write_zeroes": true, 00:09:35.469 "flush": true, 00:09:35.469 "reset": true, 00:09:35.469 "compare": false, 00:09:35.469 "compare_and_write": false, 00:09:35.469 "abort": true, 00:09:35.469 "nvme_admin": false, 00:09:35.469 "nvme_io": false 00:09:35.469 }, 00:09:35.469 "memory_domains": [ 00:09:35.469 { 00:09:35.469 "dma_device_id": "system", 00:09:35.469 "dma_device_type": 1 00:09:35.469 }, 00:09:35.469 { 00:09:35.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:35.469 "dma_device_type": 2 00:09:35.469 } 00:09:35.469 ], 00:09:35.469 "driver_specific": {} 00:09:35.469 } 00:09:35.469 ] 00:09:35.469 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:35.469 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # 
return 0 00:09:35.469 11:46:02 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:09:35.469 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:35.469 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:35.469 true 00:09:35.469 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:35.470 11:46:02 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:35.470 Dev_2 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:35.470 11:46:02 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_2 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # 
rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:35.470 [ 00:09:35.470 { 00:09:35.470 "name": "Dev_2", 00:09:35.470 "aliases": [ 00:09:35.470 "3770aec6-3ff4-4bc8-a1ee-a5dca3ac04ea" 00:09:35.470 ], 00:09:35.470 "product_name": "Malloc disk", 00:09:35.470 "block_size": 512, 00:09:35.470 "num_blocks": 262144, 00:09:35.470 "uuid": "3770aec6-3ff4-4bc8-a1ee-a5dca3ac04ea", 00:09:35.470 "assigned_rate_limits": { 00:09:35.470 "rw_ios_per_sec": 0, 00:09:35.470 "rw_mbytes_per_sec": 0, 00:09:35.470 "r_mbytes_per_sec": 0, 00:09:35.470 "w_mbytes_per_sec": 0 00:09:35.470 }, 00:09:35.470 "claimed": false, 00:09:35.470 "zoned": false, 00:09:35.470 "supported_io_types": { 00:09:35.470 "read": true, 00:09:35.470 "write": true, 00:09:35.470 "unmap": true, 00:09:35.470 "write_zeroes": true, 00:09:35.470 "flush": true, 00:09:35.470 "reset": true, 00:09:35.470 "compare": false, 00:09:35.470 "compare_and_write": false, 00:09:35.470 "abort": true, 00:09:35.470 "nvme_admin": false, 00:09:35.470 "nvme_io": false 00:09:35.470 }, 00:09:35.470 "memory_domains": [ 00:09:35.470 { 00:09:35.470 "dma_device_id": "system", 00:09:35.470 "dma_device_type": 1 00:09:35.470 }, 00:09:35.470 { 00:09:35.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:35.470 "dma_device_type": 2 00:09:35.470 } 00:09:35.470 ], 00:09:35.470 "driver_specific": {} 00:09:35.470 } 00:09:35.470 ] 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:09:35.470 11:46:02 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:35.470 
11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:35.470 11:46:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:35.470 11:46:02 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:09:35.470 11:46:02 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:35.470 Running I/O for 5 seconds... 00:09:36.405 11:46:03 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 1660324 00:09:36.405 11:46:03 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 1660324' 00:09:36.405 Process is existed as continue on error is set. Pid: 1660324 00:09:36.405 11:46:03 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:09:36.405 11:46:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.405 11:46:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:36.405 11:46:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:36.405 11:46:03 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:09:36.405 11:46:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.405 11:46:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:36.405 11:46:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:36.405 11:46:03 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:09:36.662 Timeout while waiting for response: 00:09:36.662 00:09:36.662 00:09:40.846 00:09:40.846 Latency(us) 00:09:40.846 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:40.846 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 
4096) 00:09:40.846 EE_Dev_1 : 0.90 37035.84 144.67 5.56 0.00 428.19 135.35 712.35 00:09:40.846 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:40.846 Dev_2 : 5.00 79664.39 311.19 0.00 0.00 197.08 65.45 22681.15 00:09:40.846 =================================================================================================================== 00:09:40.846 Total : 116700.23 455.86 5.56 0.00 214.91 65.45 22681.15 00:09:41.411 11:46:08 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 1660324 00:09:41.411 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@946 -- # '[' -z 1660324 ']' 00:09:41.411 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@950 -- # kill -0 1660324 00:09:41.411 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@951 -- # uname 00:09:41.411 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:41.411 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1660324 00:09:41.669 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:09:41.669 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:09:41.669 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1660324' 00:09:41.669 killing process with pid 1660324 00:09:41.669 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@965 -- # kill 1660324 00:09:41.669 Received shutdown signal, test time was about 5.000000 seconds 00:09:41.669 00:09:41.669 Latency(us) 00:09:41.669 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:41.669 =================================================================================================================== 00:09:41.669 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:41.669 11:46:08 
blockdev_general.bdev_error -- common/autotest_common.sh@970 -- # wait 1660324 00:09:41.927 11:46:08 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=1661241 00:09:41.927 11:46:08 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 1661241' 00:09:41.927 11:46:08 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:09:41.927 Process error testing pid: 1661241 00:09:41.927 11:46:08 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 1661241 00:09:41.927 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@827 -- # '[' -z 1661241 ']' 00:09:41.927 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:41.927 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:41.927 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:41.927 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:41.927 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:41.927 11:46:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:41.927 [2024-05-14 11:46:08.830048] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:09:41.927 [2024-05-14 11:46:08.830114] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661241 ] 00:09:41.927 [2024-05-14 11:46:08.950587] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:42.185 [2024-05-14 11:46:09.053453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # return 0 00:09:42.750 11:46:09 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:42.750 Dev_1 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:42.750 11:46:09 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_1 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:42.750 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:42.751 11:46:09 
blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:42.751 [ 00:09:42.751 { 00:09:42.751 "name": "Dev_1", 00:09:42.751 "aliases": [ 00:09:42.751 "4ffc5fa1-ecd6-4914-827e-7ab4940629cf" 00:09:42.751 ], 00:09:42.751 "product_name": "Malloc disk", 00:09:42.751 "block_size": 512, 00:09:42.751 "num_blocks": 262144, 00:09:42.751 "uuid": "4ffc5fa1-ecd6-4914-827e-7ab4940629cf", 00:09:42.751 "assigned_rate_limits": { 00:09:42.751 "rw_ios_per_sec": 0, 00:09:42.751 "rw_mbytes_per_sec": 0, 00:09:42.751 "r_mbytes_per_sec": 0, 00:09:42.751 "w_mbytes_per_sec": 0 00:09:42.751 }, 00:09:42.751 "claimed": false, 00:09:42.751 "zoned": false, 00:09:42.751 "supported_io_types": { 00:09:42.751 "read": true, 00:09:42.751 "write": true, 00:09:42.751 "unmap": true, 00:09:42.751 "write_zeroes": true, 00:09:42.751 "flush": true, 00:09:42.751 "reset": true, 00:09:42.751 "compare": false, 00:09:42.751 "compare_and_write": false, 00:09:42.751 "abort": true, 00:09:42.751 "nvme_admin": false, 00:09:42.751 "nvme_io": false 00:09:42.751 }, 00:09:42.751 "memory_domains": [ 00:09:42.751 { 00:09:42.751 "dma_device_id": "system", 00:09:42.751 "dma_device_type": 1 00:09:42.751 }, 00:09:42.751 { 00:09:42.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:42.751 "dma_device_type": 2 00:09:42.751 } 00:09:42.751 ], 00:09:42.751 "driver_specific": {} 00:09:42.751 } 00:09:42.751 ] 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # 
return 0 00:09:42.751 11:46:09 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:42.751 true 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:42.751 11:46:09 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:42.751 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:43.036 Dev_2 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:43.036 11:46:09 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_2 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # 
rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:43.036 [ 00:09:43.036 { 00:09:43.036 "name": "Dev_2", 00:09:43.036 "aliases": [ 00:09:43.036 "33896a64-0e2b-49be-a569-5415bc7f74b7" 00:09:43.036 ], 00:09:43.036 "product_name": "Malloc disk", 00:09:43.036 "block_size": 512, 00:09:43.036 "num_blocks": 262144, 00:09:43.036 "uuid": "33896a64-0e2b-49be-a569-5415bc7f74b7", 00:09:43.036 "assigned_rate_limits": { 00:09:43.036 "rw_ios_per_sec": 0, 00:09:43.036 "rw_mbytes_per_sec": 0, 00:09:43.036 "r_mbytes_per_sec": 0, 00:09:43.036 "w_mbytes_per_sec": 0 00:09:43.036 }, 00:09:43.036 "claimed": false, 00:09:43.036 "zoned": false, 00:09:43.036 "supported_io_types": { 00:09:43.036 "read": true, 00:09:43.036 "write": true, 00:09:43.036 "unmap": true, 00:09:43.036 "write_zeroes": true, 00:09:43.036 "flush": true, 00:09:43.036 "reset": true, 00:09:43.036 "compare": false, 00:09:43.036 "compare_and_write": false, 00:09:43.036 "abort": true, 00:09:43.036 "nvme_admin": false, 00:09:43.036 "nvme_io": false 00:09:43.036 }, 00:09:43.036 "memory_domains": [ 00:09:43.036 { 00:09:43.036 "dma_device_id": "system", 00:09:43.036 "dma_device_type": 1 00:09:43.036 }, 00:09:43.036 { 00:09:43.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:43.036 "dma_device_type": 2 00:09:43.036 } 00:09:43.036 ], 00:09:43.036 "driver_specific": {} 00:09:43.036 } 00:09:43.036 ] 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:09:43.036 11:46:09 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:43.036 
11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:43.036 11:46:09 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 1661241 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 1661241 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:09:43.036 11:46:09 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:43.036 11:46:09 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 1661241 00:09:43.036 Running I/O for 5 seconds... 
00:09:43.036 task offset: 233704 on job bdev=EE_Dev_1 fails 00:09:43.036 00:09:43.036 Latency(us) 00:09:43.036 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:43.036 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:43.036 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:09:43.036 EE_Dev_1 : 0.00 23732.47 92.70 5393.74 0.00 451.76 177.20 812.08 00:09:43.036 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:43.036 Dev_2 : 0.00 15108.59 59.02 0.00 0.00 772.90 154.05 1431.82 00:09:43.036 =================================================================================================================== 00:09:43.036 Total : 38841.06 151.72 5393.74 0.00 625.94 154.05 1431.82 00:09:43.036 [2024-05-14 11:46:10.026024] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:43.036 request: 00:09:43.036 { 00:09:43.036 "method": "perform_tests", 00:09:43.036 "req_id": 1 00:09:43.036 } 00:09:43.036 Got JSON-RPC error response 00:09:43.036 response: 00:09:43.036 { 00:09:43.036 "code": -32603, 00:09:43.036 "message": "bdevperf failed with error Operation not permitted" 00:09:43.036 } 00:09:43.295 11:46:10 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:09:43.295 11:46:10 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:43.295 11:46:10 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:09:43.295 11:46:10 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:09:43.295 11:46:10 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:09:43.295 11:46:10 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:43.295 00:09:43.295 real 0m9.047s 00:09:43.295 user 0m9.451s 00:09:43.295 sys 0m0.863s 00:09:43.295 11:46:10 blockdev_general.bdev_error -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:43.295 
11:46:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:43.295 ************************************ 00:09:43.295 END TEST bdev_error 00:09:43.295 ************************************ 00:09:43.295 11:46:10 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:09:43.295 11:46:10 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:09:43.295 11:46:10 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:43.295 11:46:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:43.554 ************************************ 00:09:43.554 START TEST bdev_stat 00:09:43.554 ************************************ 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- common/autotest_common.sh@1121 -- # stat_test_suite '' 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=1661450 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 1661450' 00:09:43.554 Process Bdev IO statistics testing pid: 1661450 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 1661450 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- common/autotest_common.sh@827 -- # '[' -z 1661450 ']' 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- common/autotest_common.sh@832 -- # local max_retries=100 
00:09:43.554 11:46:10 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:43.554 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:43.554 11:46:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:43.554 [2024-05-14 11:46:10.465907] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:09:43.554 [2024-05-14 11:46:10.465971] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661450 ] 00:09:43.554 [2024-05-14 11:46:10.593140] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:43.813 [2024-05-14 11:46:10.700121] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:43.813 [2024-05-14 11:46:10.700126] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@860 -- # return 0 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:44.380 Malloc_STAT 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- 
common/autotest_common.sh@895 -- # local bdev_name=Malloc_STAT 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local i 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:44.380 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:44.380 [ 00:09:44.380 { 00:09:44.380 "name": "Malloc_STAT", 00:09:44.380 "aliases": [ 00:09:44.380 "fdf2417e-6f81-40b3-a682-702f01d72547" 00:09:44.380 ], 00:09:44.380 "product_name": "Malloc disk", 00:09:44.639 "block_size": 512, 00:09:44.639 "num_blocks": 262144, 00:09:44.639 "uuid": "fdf2417e-6f81-40b3-a682-702f01d72547", 00:09:44.639 "assigned_rate_limits": { 00:09:44.639 "rw_ios_per_sec": 0, 00:09:44.639 "rw_mbytes_per_sec": 0, 00:09:44.639 "r_mbytes_per_sec": 0, 00:09:44.639 "w_mbytes_per_sec": 0 00:09:44.639 }, 00:09:44.639 "claimed": false, 00:09:44.639 "zoned": false, 00:09:44.639 "supported_io_types": { 00:09:44.639 "read": true, 00:09:44.639 "write": true, 00:09:44.639 "unmap": true, 00:09:44.639 "write_zeroes": true, 00:09:44.639 "flush": true, 00:09:44.639 
"reset": true, 00:09:44.639 "compare": false, 00:09:44.639 "compare_and_write": false, 00:09:44.639 "abort": true, 00:09:44.639 "nvme_admin": false, 00:09:44.639 "nvme_io": false 00:09:44.639 }, 00:09:44.639 "memory_domains": [ 00:09:44.639 { 00:09:44.639 "dma_device_id": "system", 00:09:44.639 "dma_device_type": 1 00:09:44.639 }, 00:09:44.639 { 00:09:44.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:44.639 "dma_device_type": 2 00:09:44.639 } 00:09:44.639 ], 00:09:44.639 "driver_specific": {} 00:09:44.639 } 00:09:44.639 ] 00:09:44.639 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:44.639 11:46:11 blockdev_general.bdev_stat -- common/autotest_common.sh@903 -- # return 0 00:09:44.639 11:46:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:09:44.639 11:46:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:44.639 Running I/O for 10 seconds... 
00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:09:46.544 "tick_rate": 2300000000, 00:09:46.544 "ticks": 6945987417250238, 00:09:46.544 "bdevs": [ 00:09:46.544 { 00:09:46.544 "name": "Malloc_STAT", 00:09:46.544 "bytes_read": 760263168, 00:09:46.544 "num_read_ops": 185604, 00:09:46.544 "bytes_written": 0, 00:09:46.544 "num_write_ops": 0, 00:09:46.544 "bytes_unmapped": 0, 00:09:46.544 "num_unmap_ops": 0, 00:09:46.544 "bytes_copied": 0, 00:09:46.544 "num_copy_ops": 0, 00:09:46.544 "read_latency_ticks": 2224390282860, 00:09:46.544 "max_read_latency_ticks": 14457082, 00:09:46.544 "min_read_latency_ticks": 268792, 
00:09:46.544 "write_latency_ticks": 0, 00:09:46.544 "max_write_latency_ticks": 0, 00:09:46.544 "min_write_latency_ticks": 0, 00:09:46.544 "unmap_latency_ticks": 0, 00:09:46.544 "max_unmap_latency_ticks": 0, 00:09:46.544 "min_unmap_latency_ticks": 0, 00:09:46.544 "copy_latency_ticks": 0, 00:09:46.544 "max_copy_latency_ticks": 0, 00:09:46.544 "min_copy_latency_ticks": 0, 00:09:46.544 "io_error": {} 00:09:46.544 } 00:09:46.544 ] 00:09:46.544 }' 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=185604 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:46.544 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:09:46.544 "tick_rate": 2300000000, 00:09:46.544 "ticks": 6945987577183622, 00:09:46.544 "name": "Malloc_STAT", 00:09:46.544 "channels": [ 00:09:46.544 { 00:09:46.544 "thread_id": 2, 00:09:46.544 "bytes_read": 386924544, 00:09:46.544 "num_read_ops": 94464, 00:09:46.544 "bytes_written": 0, 00:09:46.544 "num_write_ops": 0, 00:09:46.544 "bytes_unmapped": 0, 00:09:46.544 "num_unmap_ops": 0, 00:09:46.544 "bytes_copied": 0, 00:09:46.544 "num_copy_ops": 0, 00:09:46.544 "read_latency_ticks": 1151731863746, 00:09:46.544 "max_read_latency_ticks": 13097638, 00:09:46.544 "min_read_latency_ticks": 7966756, 00:09:46.544 "write_latency_ticks": 0, 00:09:46.544 "max_write_latency_ticks": 0, 00:09:46.544 "min_write_latency_ticks": 0, 00:09:46.544 "unmap_latency_ticks": 0, 00:09:46.544 "max_unmap_latency_ticks": 0, 00:09:46.544 
"min_unmap_latency_ticks": 0, 00:09:46.544 "copy_latency_ticks": 0, 00:09:46.544 "max_copy_latency_ticks": 0, 00:09:46.544 "min_copy_latency_ticks": 0 00:09:46.544 }, 00:09:46.544 { 00:09:46.544 "thread_id": 3, 00:09:46.544 "bytes_read": 400556032, 00:09:46.544 "num_read_ops": 97792, 00:09:46.544 "bytes_written": 0, 00:09:46.544 "num_write_ops": 0, 00:09:46.545 "bytes_unmapped": 0, 00:09:46.545 "num_unmap_ops": 0, 00:09:46.545 "bytes_copied": 0, 00:09:46.545 "num_copy_ops": 0, 00:09:46.545 "read_latency_ticks": 1152780449300, 00:09:46.545 "max_read_latency_ticks": 14457082, 00:09:46.545 "min_read_latency_ticks": 7922320, 00:09:46.545 "write_latency_ticks": 0, 00:09:46.545 "max_write_latency_ticks": 0, 00:09:46.545 "min_write_latency_ticks": 0, 00:09:46.545 "unmap_latency_ticks": 0, 00:09:46.545 "max_unmap_latency_ticks": 0, 00:09:46.545 "min_unmap_latency_ticks": 0, 00:09:46.545 "copy_latency_ticks": 0, 00:09:46.545 "max_copy_latency_ticks": 0, 00:09:46.545 "min_copy_latency_ticks": 0 00:09:46.545 } 00:09:46.545 ] 00:09:46.545 }' 00:09:46.545 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=94464 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=94464 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=97792 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=192256 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- 
common/autotest_common.sh@10 -- # set +x 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:09:46.804 "tick_rate": 2300000000, 00:09:46.804 "ticks": 6945987870274108, 00:09:46.804 "bdevs": [ 00:09:46.804 { 00:09:46.804 "name": "Malloc_STAT", 00:09:46.804 "bytes_read": 838906368, 00:09:46.804 "num_read_ops": 204804, 00:09:46.804 "bytes_written": 0, 00:09:46.804 "num_write_ops": 0, 00:09:46.804 "bytes_unmapped": 0, 00:09:46.804 "num_unmap_ops": 0, 00:09:46.804 "bytes_copied": 0, 00:09:46.804 "num_copy_ops": 0, 00:09:46.804 "read_latency_ticks": 2454541499880, 00:09:46.804 "max_read_latency_ticks": 14457082, 00:09:46.804 "min_read_latency_ticks": 268792, 00:09:46.804 "write_latency_ticks": 0, 00:09:46.804 "max_write_latency_ticks": 0, 00:09:46.804 "min_write_latency_ticks": 0, 00:09:46.804 "unmap_latency_ticks": 0, 00:09:46.804 "max_unmap_latency_ticks": 0, 00:09:46.804 "min_unmap_latency_ticks": 0, 00:09:46.804 "copy_latency_ticks": 0, 00:09:46.804 "max_copy_latency_ticks": 0, 00:09:46.804 "min_copy_latency_ticks": 0, 00:09:46.804 "io_error": {} 00:09:46.804 } 00:09:46.804 ] 00:09:46.804 }' 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=204804 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 192256 -lt 185604 ']' 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 192256 -gt 204804 ']' 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:46.804 00:09:46.804 
Latency(us) 00:09:46.804 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:46.804 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:46.804 Malloc_STAT : 2.17 48211.35 188.33 0.00 0.00 5297.16 1353.46 5698.78 00:09:46.804 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:46.804 Malloc_STAT : 2.17 49927.82 195.03 0.00 0.00 5115.43 990.16 6297.15 00:09:46.804 =================================================================================================================== 00:09:46.804 Total : 98139.17 383.36 0.00 0.00 5204.65 990.16 6297.15 00:09:46.804 0 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 1661450 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@946 -- # '[' -z 1661450 ']' 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@950 -- # kill -0 1661450 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@951 -- # uname 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1661450 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1661450' 00:09:46.804 killing process with pid 1661450 00:09:46.804 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@965 -- # kill 1661450 00:09:46.805 Received shutdown signal, test time was about 2.250574 seconds 00:09:46.805 00:09:46.805 Latency(us) 
00:09:46.805 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:46.805 =================================================================================================================== 00:09:46.805 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:46.805 11:46:13 blockdev_general.bdev_stat -- common/autotest_common.sh@970 -- # wait 1661450 00:09:47.064 11:46:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:09:47.064 00:09:47.064 real 0m3.671s 00:09:47.064 user 0m7.348s 00:09:47.064 sys 0m0.483s 00:09:47.064 11:46:14 blockdev_general.bdev_stat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:47.064 11:46:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:47.064 ************************************ 00:09:47.064 END TEST bdev_stat 00:09:47.064 ************************************ 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:09:47.064 11:46:14 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:09:47.064 00:09:47.064 real 1m57.285s 00:09:47.064 user 7m11.581s 00:09:47.064 sys 0m22.923s 00:09:47.064 11:46:14 
blockdev_general -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:47.064 11:46:14 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:47.064 ************************************ 00:09:47.064 END TEST blockdev_general 00:09:47.064 ************************************ 00:09:47.324 11:46:14 -- spdk/autotest.sh@186 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:47.324 11:46:14 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:09:47.324 11:46:14 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:47.324 11:46:14 -- common/autotest_common.sh@10 -- # set +x 00:09:47.324 ************************************ 00:09:47.324 START TEST bdev_raid 00:09:47.324 ************************************ 00:09:47.324 11:46:14 bdev_raid -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:47.324 * Looking for test storage... 00:09:47.324 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:47.324 11:46:14 bdev_raid -- bdev/bdev_raid.sh@12 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:47.324 11:46:14 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:09:47.324 11:46:14 bdev_raid -- bdev/bdev_raid.sh@14 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:09:47.324 11:46:14 bdev_raid -- bdev/bdev_raid.sh@800 -- # trap 'on_error_exit;' ERR 00:09:47.324 11:46:14 bdev_raid -- bdev/bdev_raid.sh@802 -- # base_blocklen=512 00:09:47.324 11:46:14 bdev_raid -- bdev/bdev_raid.sh@804 -- # uname -s 00:09:47.324 11:46:14 bdev_raid -- bdev/bdev_raid.sh@804 -- # '[' Linux = Linux ']' 00:09:47.324 11:46:14 bdev_raid -- bdev/bdev_raid.sh@804 -- # modprobe -n nbd 00:09:47.324 11:46:14 bdev_raid -- bdev/bdev_raid.sh@805 -- # has_nbd=true 00:09:47.324 11:46:14 bdev_raid -- bdev/bdev_raid.sh@806 -- # modprobe nbd 00:09:47.324 
11:46:14 bdev_raid -- bdev/bdev_raid.sh@807 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:09:47.324 11:46:14 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:09:47.324 11:46:14 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:47.324 11:46:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:47.324 ************************************ 00:09:47.324 START TEST raid_function_test_raid0 00:09:47.324 ************************************ 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1121 -- # raid_function_test raid0 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local raid_level=raid0 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local nbd=/dev/nbd0 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@83 -- # local raid_bdev 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # raid_pid=1662064 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # echo 'Process raid pid: 1662064' 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:47.324 Process raid pid: 1662064 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@88 -- # waitforlisten 1662064 /var/tmp/spdk-raid.sock 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@827 -- # '[' -z 1662064 ']' 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 
-- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:47.324 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:47.324 11:46:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:47.584 [2024-05-14 11:46:14.453962] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:09:47.584 [2024-05-14 11:46:14.454023] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:47.584 [2024-05-14 11:46:14.584764] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:47.842 [2024-05-14 11:46:14.689137] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:47.842 [2024-05-14 11:46:14.754732] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:47.842 [2024-05-14 11:46:14.754768] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:48.410 11:46:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:48.410 11:46:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@860 -- # return 0 00:09:48.410 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # configure_raid_bdev raid0 00:09:48.410 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # local raid_level=raid0 00:09:48.410 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@68 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:48.410 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@70 -- # cat 00:09:48.410 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@75 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:48.668 [2024-05-14 11:46:15.555197] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:48.668 [2024-05-14 11:46:15.556665] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:48.668 [2024-05-14 11:46:15.556722] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e231f0 00:09:48.668 [2024-05-14 11:46:15.556733] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:48.668 [2024-05-14 11:46:15.556925] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c864e0 00:09:48.668 [2024-05-14 11:46:15.557047] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e231f0 00:09:48.668 [2024-05-14 11:46:15.557057] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1e231f0 00:09:48.668 [2024-05-14 11:46:15.557158] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:48.668 Base_1 00:09:48.669 Base_2 00:09:48.669 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@77 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:48.669 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:48.669 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # jq -r '.[0]["name"] | select(.)' 00:09:48.927 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # raid_bdev=raid 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@92 -- # '[' raid = '' ']' 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:48.928 11:46:15 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:48.928 11:46:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:49.186 [2024-05-14 11:46:16.048539] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c66ce0 00:09:49.186 /dev/nbd0 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@865 -- # local i 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- 
common/autotest_common.sh@869 -- # break 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.186 1+0 records in 00:09:49.186 1+0 records out 00:09:49.186 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250154 s, 16.4 MB/s 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # size=4096 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # return 0 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:49.186 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # 
nbd_disks_json='[ 00:09:49.444 { 00:09:49.444 "nbd_device": "/dev/nbd0", 00:09:49.444 "bdev_name": "raid" 00:09:49.444 } 00:09:49.444 ]' 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:49.444 { 00:09:49.444 "nbd_device": "/dev/nbd0", 00:09:49.444 "bdev_name": "raid" 00:09:49.444 } 00:09:49.444 ]' 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # count=1 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@99 -- # '[' 1 -ne 1 ']' 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@103 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@17 -- # hash blkdiscard 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # local nbd=/dev/nbd0 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local blksize 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # grep -v LOG-SEC 00:09:49.444 11:46:16 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # cut -d ' ' -f 5 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # blksize=512 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # local rw_blk_num=4096 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_len=2097152 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # unmap_blk_offs=('0' '1028' '321') 00:09:49.444 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local unmap_blk_offs 00:09:49.445 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_nums=('128' '2035' '456') 00:09:49.445 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_nums 00:09:49.445 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_off 00:09:49.445 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_len 00:09:49.445 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@30 -- # dd if=/dev/urandom of=/raidrandtest bs=512 count=4096 00:09:49.445 4096+0 records in 00:09:49.445 4096+0 records out 00:09:49.445 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0272115 s, 77.1 MB/s 00:09:49.445 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:49.703 4096+0 records in 00:09:49.703 4096+0 records out 00:09:49.703 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.198037 s, 10.6 MB/s 00:09:49.703 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # blockdev --flushbufs /dev/nbd0 00:09:49.703 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@35 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:09:49.703 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i = 0 )) 00:09:49.703 11:46:16 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:09:49.703 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # unmap_off=0 00:09:49.703 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_len=65536 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:49.704 128+0 records in 00:09:49.704 128+0 records out 00:09:49.704 65536 bytes (66 kB, 64 KiB) copied, 0.000834353 s, 78.5 MB/s 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # unmap_off=526336 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_len=1041920 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:49.704 2035+0 records in 00:09:49.704 2035+0 records out 00:09:49.704 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0120271 s, 86.6 MB/s 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest 
/dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # unmap_off=164352 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_len=233472 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:49.704 456+0 records in 00:09:49.704 456+0 records out 00:09:49.704 233472 bytes (233 kB, 228 KiB) copied, 0.00271124 s, 86.1 MB/s 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@53 -- # return 0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:49.704 11:46:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:49.963 [2024-05-14 11:46:17.010919] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:49.963 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:50.222 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:50.222 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:50.222 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:50.481 11:46:17 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # count=0 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@107 -- # '[' 0 -ne 0 ']' 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@111 -- # killprocess 1662064 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@946 -- # '[' -z 1662064 ']' 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@950 -- # kill -0 1662064 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@951 -- # uname 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1662064 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1662064' 00:09:50.481 killing process with pid 1662064 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@965 -- # kill 1662064 00:09:50.481 [2024-05-14 
11:46:17.369576] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:50.481 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@970 -- # wait 1662064 00:09:50.481 [2024-05-14 11:46:17.369647] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:50.481 [2024-05-14 11:46:17.369695] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:50.481 [2024-05-14 11:46:17.369707] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e231f0 name raid, state offline 00:09:50.481 [2024-05-14 11:46:17.389049] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:50.739 11:46:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@113 -- # return 0 00:09:50.739 00:09:50.739 real 0m3.211s 00:09:50.739 user 0m4.224s 00:09:50.739 sys 0m1.210s 00:09:50.739 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:50.739 11:46:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:50.739 ************************************ 00:09:50.739 END TEST raid_function_test_raid0 00:09:50.739 ************************************ 00:09:50.739 11:46:17 bdev_raid -- bdev/bdev_raid.sh@808 -- # run_test raid_function_test_concat raid_function_test concat 00:09:50.739 11:46:17 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:09:50.739 11:46:17 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:50.739 11:46:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:50.739 ************************************ 00:09:50.739 START TEST raid_function_test_concat 00:09:50.739 ************************************ 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1121 -- # raid_function_test concat 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local 
raid_level=concat 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local nbd=/dev/nbd0 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@83 -- # local raid_bdev 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # raid_pid=1662666 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # echo 'Process raid pid: 1662666' 00:09:50.739 Process raid pid: 1662666 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@88 -- # waitforlisten 1662666 /var/tmp/spdk-raid.sock 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@827 -- # '[' -z 1662666 ']' 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:50.739 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:50.739 11:46:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:50.739 [2024-05-14 11:46:17.748967] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:09:50.739 [2024-05-14 11:46:17.749028] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:50.997 [2024-05-14 11:46:17.880161] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:50.997 [2024-05-14 11:46:17.986512] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:50.997 [2024-05-14 11:46:18.055269] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:50.997 [2024-05-14 11:46:18.055305] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@860 -- # return 0 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # configure_raid_bdev concat 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # local raid_level=concat 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@68 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@70 -- # cat 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:51.931 [2024-05-14 11:46:18.939976] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:51.931 [2024-05-14 11:46:18.941440] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:51.931 [2024-05-14 11:46:18.941504] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x17d01f0 00:09:51.931 [2024-05-14 11:46:18.941515] 
bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:51.931 [2024-05-14 11:46:18.941704] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16334e0 00:09:51.931 [2024-05-14 11:46:18.941826] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17d01f0 00:09:51.931 [2024-05-14 11:46:18.941836] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x17d01f0 00:09:51.931 [2024-05-14 11:46:18.941934] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:51.931 Base_1 00:09:51.931 Base_2 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@77 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:51.931 11:46:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # jq -r '.[0]["name"] | select(.)' 00:09:52.189 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # raid_bdev=raid 00:09:52.189 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@92 -- # '[' raid = '' ']' 00:09:52.189 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:52.189 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:52.189 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:52.189 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:52.189 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:52.190 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:09:52.190 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:09:52.190 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:52.190 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:52.190 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:52.447 [2024-05-14 11:46:19.437311] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1613ce0 00:09:52.447 /dev/nbd0 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@865 -- # local i 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # break 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:52.447 1+0 records in 
00:09:52.447 1+0 records out 00:09:52.447 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255476 s, 16.0 MB/s 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # size=4096 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # return 0 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:52.447 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:52.705 { 00:09:52.705 "nbd_device": "/dev/nbd0", 00:09:52.705 "bdev_name": "raid" 00:09:52.705 } 00:09:52.705 ]' 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:52.705 { 00:09:52.705 "nbd_device": "/dev/nbd0", 00:09:52.705 "bdev_name": "raid" 00:09:52.705 } 00:09:52.705 ]' 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:52.705 11:46:19 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # count=1 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@99 -- # '[' 1 -ne 1 ']' 00:09:52.705 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@103 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@17 -- # hash blkdiscard 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # local nbd=/dev/nbd0 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local blksize 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # grep -v LOG-SEC 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # cut -d ' ' -f 5 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # blksize=512 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # local rw_blk_num=4096 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_len=2097152 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # 
unmap_blk_offs=('0' '1028' '321') 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local unmap_blk_offs 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_nums=('128' '2035' '456') 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_nums 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_off 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_len 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@30 -- # dd if=/dev/urandom of=/raidrandtest bs=512 count=4096 00:09:52.962 4096+0 records in 00:09:52.962 4096+0 records out 00:09:52.962 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0280938 s, 74.6 MB/s 00:09:52.962 11:46:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:52.962 4096+0 records in 00:09:52.962 4096+0 records out 00:09:52.962 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.198383 s, 10.6 MB/s 00:09:52.962 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # blockdev --flushbufs /dev/nbd0 00:09:52.962 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@35 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:09:52.962 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i = 0 )) 00:09:52.962 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:09:52.962 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # unmap_off=0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_len=65536 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:53.220 128+0 records in 00:09:53.220 
128+0 records out 00:09:53.220 65536 bytes (66 kB, 64 KiB) copied, 0.000856893 s, 76.5 MB/s 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # unmap_off=526336 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_len=1041920 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:53.220 2035+0 records in 00:09:53.220 2035+0 records out 00:09:53.220 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0109233 s, 95.4 MB/s 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # unmap_off=164352 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_len=233472 00:09:53.220 11:46:20 
bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:53.220 456+0 records in 00:09:53.220 456+0 records out 00:09:53.220 233472 bytes (233 kB, 228 KiB) copied, 0.00283053 s, 82.5 MB/s 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@53 -- # return 0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:09:53.220 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:53.221 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:53.478 [2024-05-14 11:46:20.378381] bdev_raid.c: 
315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:53.478 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:09:53.736 11:46:20 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # count=0 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@107 -- # '[' 0 -ne 0 ']' 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@111 -- # killprocess 1662666 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@946 -- # '[' -z 1662666 ']' 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@950 -- # kill -0 1662666 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@951 -- # uname 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1662666 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1662666' 00:09:53.736 killing process with pid 1662666 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@965 -- # kill 1662666 00:09:53.736 [2024-05-14 11:46:20.751696] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:53.736 [2024-05-14 11:46:20.751771] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:53.736 [2024-05-14 11:46:20.751820] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:53.736 [2024-05-14 
11:46:20.751834] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17d01f0 name raid, state offline 00:09:53.736 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@970 -- # wait 1662666 00:09:53.736 [2024-05-14 11:46:20.768753] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:53.994 11:46:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@113 -- # return 0 00:09:53.994 00:09:53.994 real 0m3.291s 00:09:53.994 user 0m4.453s 00:09:53.994 sys 0m1.162s 00:09:53.994 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:53.994 11:46:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:53.994 ************************************ 00:09:53.994 END TEST raid_function_test_concat 00:09:53.994 ************************************ 00:09:53.994 11:46:21 bdev_raid -- bdev/bdev_raid.sh@811 -- # run_test raid0_resize_test raid0_resize_test 00:09:53.994 11:46:21 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:09:53.994 11:46:21 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:53.994 11:46:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:53.994 ************************************ 00:09:53.994 START TEST raid0_resize_test 00:09:53.994 ************************************ 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1121 -- # raid0_resize_test 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:09:53.994 
11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # raid_pid=1663118 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # echo 'Process raid pid: 1663118' 00:09:53.994 Process raid pid: 1663118 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@358 -- # waitforlisten 1663118 /var/tmp/spdk-raid.sock 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@827 -- # '[' -z 1663118 ']' 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:53.994 11:46:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:53.994 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:53.995 11:46:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:53.995 11:46:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:53.995 11:46:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:54.253 [2024-05-14 11:46:21.118827] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:09:54.253 [2024-05-14 11:46:21.118887] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:54.253 [2024-05-14 11:46:21.246011] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.512 [2024-05-14 11:46:21.349235] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:54.512 [2024-05-14 11:46:21.412697] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:54.512 [2024-05-14 11:46:21.412731] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:55.079 11:46:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:55.079 11:46:22 bdev_raid.raid0_resize_test -- common/autotest_common.sh@860 -- # return 0 00:09:55.079 11:46:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:09:55.338 Base_1 00:09:55.338 11:46:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:09:55.596 Base_2 00:09:55.596 11:46:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@363 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:09:55.855 [2024-05-14 11:46:22.738832] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:55.855 [2024-05-14 11:46:22.740409] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:55.855 [2024-05-14 11:46:22.740460] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x17dbd90 00:09:55.855 [2024-05-14 11:46:22.740470] 
bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:55.855 [2024-05-14 11:46:22.740681] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x162bfc0 00:09:55.855 [2024-05-14 11:46:22.740787] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17dbd90 00:09:55.855 [2024-05-14 11:46:22.740796] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x17dbd90 00:09:55.855 [2024-05-14 11:46:22.740909] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:55.855 11:46:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@366 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:09:56.113 [2024-05-14 11:46:22.975433] bdev_raid.c:2216:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:56.113 [2024-05-14 11:46:22.975457] bdev_raid.c:2229:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:09:56.113 true 00:09:56.113 11:46:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:56.113 11:46:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # jq '.[].num_blocks' 00:09:56.372 [2024-05-14 11:46:23.216191] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:56.372 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # blkcnt=131072 00:09:56.372 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # raid_size_mb=64 00:09:56.372 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@371 -- # '[' 64 '!=' 64 ']' 00:09:56.372 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:09:56.372 
[2024-05-14 11:46:23.456679] bdev_raid.c:2216:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:56.372 [2024-05-14 11:46:23.456700] bdev_raid.c:2229:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:09:56.372 [2024-05-14 11:46:23.456724] bdev_raid.c:2243:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:09:56.630 true 00:09:56.630 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:56.630 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # jq '.[].num_blocks' 00:09:56.630 [2024-05-14 11:46:23.697446] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # blkcnt=262144 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # raid_size_mb=128 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@382 -- # '[' 128 '!=' 128 ']' 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@387 -- # killprocess 1663118 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@946 -- # '[' -z 1663118 ']' 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@950 -- # kill -0 1663118 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@951 -- # uname 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1663118 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # '[' 
reactor_0 = sudo ']' 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1663118' 00:09:56.889 killing process with pid 1663118 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@965 -- # kill 1663118 00:09:56.889 [2024-05-14 11:46:23.767423] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:56.889 [2024-05-14 11:46:23.767489] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:56.889 [2024-05-14 11:46:23.767536] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:56.889 [2024-05-14 11:46:23.767549] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17dbd90 name Raid, state offline 00:09:56.889 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@970 -- # wait 1663118 00:09:56.889 [2024-05-14 11:46:23.768882] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:57.148 11:46:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@389 -- # return 0 00:09:57.148 00:09:57.148 real 0m2.917s 00:09:57.148 user 0m4.474s 00:09:57.148 sys 0m0.640s 00:09:57.148 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:57.148 11:46:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:57.148 ************************************ 00:09:57.148 END TEST raid0_resize_test 00:09:57.148 ************************************ 00:09:57.148 11:46:24 bdev_raid -- bdev/bdev_raid.sh@813 -- # for n in {2..4} 00:09:57.148 11:46:24 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:09:57.148 11:46:24 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:09:57.148 11:46:24 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:09:57.148 11:46:24 bdev_raid -- common/autotest_common.sh@1103 -- 
# xtrace_disable
00:09:57.148 11:46:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:09:57.148 ************************************
00:09:57.148 START TEST raid_state_function_test
00:09:57.148 ************************************
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 2 false
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 ))
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs ))
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ ))
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs ))
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ ))
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs ))
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']'
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64'
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']'
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg=
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=1663562
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1663562'
00:09:57.148 Process raid pid: 1663562
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 1663562 /var/tmp/spdk-raid.sock
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 1663562 ']'
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:09:57.148 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:09:57.148 11:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:09:57.148 [2024-05-14 11:46:24.115610] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:09:57.148 [2024-05-14 11:46:24.115672] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:09:57.406 [2024-05-14 11:46:24.236286] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:57.406 [2024-05-14 11:46:24.334314] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:57.406 [2024-05-14 11:46:24.399396] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:09:57.407 [2024-05-14 11:46:24.399437] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:09:57.972 11:46:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 ))
00:09:57.972 11:46:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0
00:09:57.972 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:09:58.230 [2024-05-14 11:46:25.269134] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:09:58.230 [2024-05-14 11:46:25.269179] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:09:58.230 [2024-05-14 11:46:25.269190] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:09:58.230 [2024-05-14 11:46:25.269202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:09:58.230 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:09:58.530 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{
00:09:58.530 "name": "Existed_Raid",
00:09:58.530 "uuid": "00000000-0000-0000-0000-000000000000",
00:09:58.530 "strip_size_kb": 64,
00:09:58.530 "state": "configuring",
00:09:58.530 "raid_level": "raid0",
00:09:58.530 "superblock": false,
00:09:58.530 "num_base_bdevs": 2,
00:09:58.530 "num_base_bdevs_discovered": 0,
00:09:58.530 "num_base_bdevs_operational": 2,
00:09:58.530 "base_bdevs_list": [
00:09:58.530 {
00:09:58.530 "name": "BaseBdev1",
00:09:58.530 "uuid": "00000000-0000-0000-0000-000000000000",
00:09:58.530 "is_configured": false,
00:09:58.530 "data_offset": 0,
00:09:58.530 "data_size": 0
00:09:58.530 },
00:09:58.530 {
00:09:58.530 "name": "BaseBdev2",
00:09:58.530 "uuid": "00000000-0000-0000-0000-000000000000",
00:09:58.530 "is_configured": false,
00:09:58.530 "data_offset": 0,
00:09:58.530 "data_size": 0
00:09:58.530 }
00:09:58.530 ]
00:09:58.530 }'
00:09:58.530 11:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable
00:09:58.530 11:46:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:09:59.127 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:09:59.385 [2024-05-14 11:46:26.311755] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:09:59.385 [2024-05-14 11:46:26.311785] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22a0700 name Existed_Raid, state configuring
00:09:59.385 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:09:59.644 [2024-05-14 11:46:26.560429] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:09:59.644 [2024-05-14 11:46:26.560458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:09:59.644 [2024-05-14 11:46:26.560468] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:09:59.644 [2024-05-14 11:46:26.560480] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:09:59.644 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:09:59.902 [2024-05-14 11:46:26.815029] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:09:59.902 BaseBdev1
00:09:59.902 11:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1
00:09:59.902 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1
00:09:59.902 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout=
00:09:59.902 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i
00:09:59.902 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]]
00:09:59.902 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000
00:09:59.902 11:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:10:00.160 11:46:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:10:00.418 [
00:10:00.418 {
00:10:00.418 "name": "BaseBdev1",
00:10:00.418 "aliases": [
00:10:00.418 "71e916f8-08be-4fbe-a4a7-13ab034bb414"
00:10:00.418 ],
00:10:00.418 "product_name": "Malloc disk",
00:10:00.418 "block_size": 512,
00:10:00.418 "num_blocks": 65536,
00:10:00.418 "uuid": "71e916f8-08be-4fbe-a4a7-13ab034bb414",
00:10:00.418 "assigned_rate_limits": {
00:10:00.419 "rw_ios_per_sec": 0,
00:10:00.419 "rw_mbytes_per_sec": 0,
00:10:00.419 "r_mbytes_per_sec": 0,
00:10:00.419 "w_mbytes_per_sec": 0
00:10:00.419 },
00:10:00.419 "claimed": true,
00:10:00.419 "claim_type": "exclusive_write",
00:10:00.419 "zoned": false,
00:10:00.419 "supported_io_types": {
00:10:00.419 "read": true,
00:10:00.419 "write": true,
00:10:00.419 "unmap": true,
00:10:00.419 "write_zeroes": true,
00:10:00.419 "flush": true,
00:10:00.419 "reset": true,
00:10:00.419 "compare": false,
00:10:00.419 "compare_and_write": false,
00:10:00.419 "abort": true,
00:10:00.419 "nvme_admin": false,
00:10:00.419 "nvme_io": false
00:10:00.419 },
00:10:00.419 "memory_domains": [
00:10:00.419 {
00:10:00.419 "dma_device_id": "system",
00:10:00.419 "dma_device_type": 1
00:10:00.419 },
00:10:00.419 {
00:10:00.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:00.419 "dma_device_type": 2
00:10:00.419 }
00:10:00.419 ],
00:10:00.419 "driver_specific": {}
00:10:00.419 }
00:10:00.419 ]
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:10:00.419 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:10:00.678 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{
00:10:00.678 "name": "Existed_Raid",
00:10:00.678 "uuid": "00000000-0000-0000-0000-000000000000",
00:10:00.678 "strip_size_kb": 64,
00:10:00.678 "state": "configuring",
00:10:00.678 "raid_level": "raid0",
00:10:00.678 "superblock": false,
00:10:00.678 "num_base_bdevs": 2,
00:10:00.678 "num_base_bdevs_discovered": 1,
00:10:00.678 "num_base_bdevs_operational": 2,
00:10:00.678 "base_bdevs_list": [
00:10:00.678 {
00:10:00.678 "name": "BaseBdev1",
00:10:00.678 "uuid": "71e916f8-08be-4fbe-a4a7-13ab034bb414",
00:10:00.678 "is_configured": true,
00:10:00.678 "data_offset": 0,
00:10:00.678 "data_size": 65536
00:10:00.678 },
00:10:00.678 {
00:10:00.678 "name": "BaseBdev2",
00:10:00.678 "uuid": "00000000-0000-0000-0000-000000000000",
00:10:00.678 "is_configured": false,
00:10:00.678 "data_offset": 0,
00:10:00.678 "data_size": 0
00:10:00.678 }
00:10:00.678 ]
00:10:00.678 }'
00:10:00.678 11:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable
00:10:00.678 11:46:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:10:01.244 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:10:01.503 [2024-05-14 11:46:28.367152] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:10:01.503 [2024-05-14 11:46:28.367196] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22a09a0 name Existed_Raid, state configuring
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:10:01.503 [2024-05-14 11:46:28.543649] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:10:01.503 [2024-05-14 11:46:28.545137] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:10:01.503 [2024-05-14 11:46:28.545174] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 ))
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs ))
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:10:01.503 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:10:01.761 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{
00:10:01.761 "name": "Existed_Raid",
00:10:01.761 "uuid": "00000000-0000-0000-0000-000000000000",
00:10:01.761 "strip_size_kb": 64,
00:10:01.761 "state": "configuring",
00:10:01.761 "raid_level": "raid0",
00:10:01.761 "superblock": false,
00:10:01.761 "num_base_bdevs": 2,
00:10:01.761 "num_base_bdevs_discovered": 1,
00:10:01.761 "num_base_bdevs_operational": 2,
00:10:01.761 "base_bdevs_list": [
00:10:01.761 {
00:10:01.761 "name": "BaseBdev1",
00:10:01.761 "uuid": "71e916f8-08be-4fbe-a4a7-13ab034bb414",
00:10:01.761 "is_configured": true,
00:10:01.761 "data_offset": 0,
00:10:01.761 "data_size": 65536
00:10:01.761 },
00:10:01.761 {
00:10:01.761 "name": "BaseBdev2",
00:10:01.761 "uuid": "00000000-0000-0000-0000-000000000000",
00:10:01.761 "is_configured": false,
00:10:01.761 "data_offset": 0,
00:10:01.761 "data_size": 0
00:10:01.761 }
00:10:01.761 ]
00:10:01.761 }'
00:10:01.761 11:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable
00:10:01.761 11:46:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:10:02.695 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:10:02.695 [2024-05-14 11:46:29.647293] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:10:02.695 [2024-05-14 11:46:29.647333] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x229fff0
00:10:02.695 [2024-05-14 11:46:29.647342] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512
00:10:02.695 [2024-05-14 11:46:29.647542] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22a2810
00:10:02.695 [2024-05-14 11:46:29.647662] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x229fff0
00:10:02.695 [2024-05-14 11:46:29.647672] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x229fff0
00:10:02.695 [2024-05-14 11:46:29.647839] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:10:02.695 BaseBdev2
00:10:02.695 11:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2
00:10:02.695 11:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2
00:10:02.695 11:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout=
00:10:02.695 11:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i
00:10:02.695 11:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]]
00:10:02.695 11:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000
00:10:02.695 11:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:10:02.953 11:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:10:03.211 [
00:10:03.211 {
00:10:03.211 "name": "BaseBdev2",
00:10:03.211 "aliases": [
00:10:03.211 "97fa9dfe-4d00-4888-8146-6cf253ae12f1"
00:10:03.211 ],
00:10:03.211 "product_name": "Malloc disk",
00:10:03.211 "block_size": 512,
00:10:03.211 "num_blocks": 65536,
00:10:03.211 "uuid": "97fa9dfe-4d00-4888-8146-6cf253ae12f1",
00:10:03.211 "assigned_rate_limits": {
00:10:03.211 "rw_ios_per_sec": 0,
00:10:03.211 "rw_mbytes_per_sec": 0,
00:10:03.211 "r_mbytes_per_sec": 0,
00:10:03.211 "w_mbytes_per_sec": 0
00:10:03.211 },
00:10:03.211 "claimed": true,
00:10:03.211 "claim_type": "exclusive_write",
00:10:03.211 "zoned": false,
00:10:03.211 "supported_io_types": {
00:10:03.211 "read": true,
00:10:03.211 "write": true,
00:10:03.211 "unmap": true,
00:10:03.211 "write_zeroes": true,
00:10:03.211 "flush": true,
00:10:03.211 "reset": true,
00:10:03.211 "compare": false,
00:10:03.211 "compare_and_write": false,
00:10:03.211 "abort": true,
00:10:03.211 "nvme_admin": false,
00:10:03.211 "nvme_io": false
00:10:03.211 },
00:10:03.211 "memory_domains": [
00:10:03.211 {
00:10:03.211 "dma_device_id": "system",
00:10:03.211 "dma_device_type": 1
00:10:03.211 },
00:10:03.211 {
00:10:03.211 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:03.211 "dma_device_type": 2
00:10:03.211 }
00:10:03.211 ],
00:10:03.211 "driver_specific": {}
00:10:03.211 }
00:10:03.211 ]
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ ))
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs ))
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:10:03.211 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:10:03.469 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{
00:10:03.469 "name": "Existed_Raid",
00:10:03.469 "uuid": "3fa16e47-6138-46cf-9375-a8c467374a62",
00:10:03.469 "strip_size_kb": 64,
00:10:03.469 "state": "online",
00:10:03.469 "raid_level": "raid0",
00:10:03.469 "superblock": false,
00:10:03.469 "num_base_bdevs": 2,
00:10:03.469 "num_base_bdevs_discovered": 2,
00:10:03.469 "num_base_bdevs_operational": 2,
00:10:03.469 "base_bdevs_list": [
00:10:03.469 {
00:10:03.469 "name": "BaseBdev1",
00:10:03.469 "uuid": "71e916f8-08be-4fbe-a4a7-13ab034bb414",
00:10:03.469 "is_configured": true,
00:10:03.469 "data_offset": 0,
00:10:03.469 "data_size": 65536
00:10:03.469 },
00:10:03.469 {
00:10:03.469 "name": "BaseBdev2",
00:10:03.469 "uuid": "97fa9dfe-4d00-4888-8146-6cf253ae12f1",
00:10:03.469 "is_configured": true,
00:10:03.469 "data_offset": 0,
00:10:03.469 "data_size": 65536
00:10:03.469 }
00:10:03.469 ]
00:10:03.469 }'
00:10:03.469 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable
00:10:03.469 11:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:10:04.036 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid
00:10:04.036 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid
00:10:04.036 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info
00:10:04.036 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info
00:10:04.036 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names
00:10:04.036 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name
00:10:04.036 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]'
00:10:04.036 11:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:10:04.294 [2024-05-14 11:46:31.215856] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:10:04.294 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{
00:10:04.294 "name": "Existed_Raid",
00:10:04.294 "aliases": [
00:10:04.294 "3fa16e47-6138-46cf-9375-a8c467374a62"
00:10:04.294 ],
00:10:04.294 "product_name": "Raid Volume",
00:10:04.294 "block_size": 512,
00:10:04.294 "num_blocks": 131072,
00:10:04.294 "uuid": "3fa16e47-6138-46cf-9375-a8c467374a62",
00:10:04.294 "assigned_rate_limits": {
00:10:04.294 "rw_ios_per_sec": 0,
00:10:04.294 "rw_mbytes_per_sec": 0,
00:10:04.294 "r_mbytes_per_sec": 0,
00:10:04.294 "w_mbytes_per_sec": 0
00:10:04.294 },
00:10:04.294 "claimed": false,
00:10:04.294 "zoned": false,
00:10:04.294 "supported_io_types": {
00:10:04.294 "read": true,
00:10:04.294 "write": true,
00:10:04.294 "unmap": true,
00:10:04.294 "write_zeroes": true,
00:10:04.294 "flush": true,
00:10:04.294 "reset": true,
00:10:04.294 "compare": false,
00:10:04.294 "compare_and_write": false,
00:10:04.294 "abort": false,
00:10:04.294 "nvme_admin": false,
00:10:04.294 "nvme_io": false
00:10:04.294 },
00:10:04.294 "memory_domains": [
00:10:04.294 {
00:10:04.294 "dma_device_id": "system",
00:10:04.294 "dma_device_type": 1
00:10:04.294 },
00:10:04.294 {
00:10:04.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:04.294 "dma_device_type": 2
00:10:04.294 },
00:10:04.294 {
00:10:04.294 "dma_device_id": "system",
00:10:04.294 "dma_device_type": 1
00:10:04.294 },
00:10:04.294 {
00:10:04.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:04.294 "dma_device_type": 2
00:10:04.294 }
00:10:04.294 ],
00:10:04.294 "driver_specific": {
00:10:04.294 "raid": {
00:10:04.294 "uuid": "3fa16e47-6138-46cf-9375-a8c467374a62",
00:10:04.294 "strip_size_kb": 64,
00:10:04.294 "state": "online",
00:10:04.294 "raid_level": "raid0",
00:10:04.294 "superblock": false,
00:10:04.294 "num_base_bdevs": 2,
00:10:04.294 "num_base_bdevs_discovered": 2,
00:10:04.294 "num_base_bdevs_operational": 2,
00:10:04.294 "base_bdevs_list": [
00:10:04.294 {
00:10:04.294 "name": "BaseBdev1",
00:10:04.294 "uuid": "71e916f8-08be-4fbe-a4a7-13ab034bb414",
00:10:04.294 "is_configured": true,
00:10:04.294 "data_offset": 0,
00:10:04.294 "data_size": 65536
00:10:04.294 },
00:10:04.294 {
00:10:04.294 "name": "BaseBdev2",
00:10:04.294 "uuid": "97fa9dfe-4d00-4888-8146-6cf253ae12f1",
00:10:04.294 "is_configured": true,
00:10:04.294 "data_offset": 0,
00:10:04.294 "data_size": 65536
00:10:04.294 }
00:10:04.294 ]
00:10:04.294 }
00:10:04.294 }
00:10:04.294 }'
00:10:04.294 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:10:04.294 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1
00:10:04.294 BaseBdev2'
00:10:04.294 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names
00:10:04.294 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1
00:10:04.294 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]'
00:10:04.553 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{
00:10:04.553 "name": "BaseBdev1",
00:10:04.553 "aliases": [
00:10:04.553 "71e916f8-08be-4fbe-a4a7-13ab034bb414"
00:10:04.553 ],
00:10:04.553 "product_name": "Malloc disk",
00:10:04.553 "block_size": 512,
00:10:04.553 "num_blocks": 65536,
00:10:04.553 "uuid": "71e916f8-08be-4fbe-a4a7-13ab034bb414",
00:10:04.553 "assigned_rate_limits": {
00:10:04.553 "rw_ios_per_sec": 0,
00:10:04.553 "rw_mbytes_per_sec": 0,
00:10:04.553 "r_mbytes_per_sec": 0,
00:10:04.553 "w_mbytes_per_sec": 0
00:10:04.553 },
00:10:04.553 "claimed": true,
00:10:04.553 "claim_type": "exclusive_write",
00:10:04.553 "zoned": false,
00:10:04.553 "supported_io_types": {
00:10:04.553 "read": true,
00:10:04.553 "write": true,
00:10:04.553 "unmap": true,
00:10:04.553 "write_zeroes": true,
00:10:04.553 "flush": true,
00:10:04.553 "reset": true,
00:10:04.553 "compare": false,
00:10:04.553 "compare_and_write": false,
00:10:04.553 "abort": true,
00:10:04.553 "nvme_admin": false,
00:10:04.553 "nvme_io": false
00:10:04.553 },
00:10:04.553 "memory_domains": [
00:10:04.553 {
00:10:04.553 "dma_device_id": "system",
00:10:04.553 "dma_device_type": 1
00:10:04.553 },
00:10:04.553 {
00:10:04.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:04.553 "dma_device_type": 2
00:10:04.553 }
00:10:04.553 ],
00:10:04.553 "driver_specific": {}
00:10:04.553 }'
00:10:04.553 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:10:04.553 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:10:04.553 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]]
00:10:04.553 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]]
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2
00:10:04.811 11:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]'
00:10:05.069 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{
00:10:05.069 "name": "BaseBdev2",
00:10:05.069 "aliases": [
00:10:05.069 "97fa9dfe-4d00-4888-8146-6cf253ae12f1"
00:10:05.069 ],
00:10:05.069 "product_name": "Malloc disk",
00:10:05.069 "block_size": 512,
00:10:05.069 "num_blocks": 65536,
00:10:05.069 "uuid": "97fa9dfe-4d00-4888-8146-6cf253ae12f1",
00:10:05.069 "assigned_rate_limits": {
00:10:05.069 "rw_ios_per_sec": 0,
00:10:05.069 "rw_mbytes_per_sec": 0,
00:10:05.069 "r_mbytes_per_sec": 0,
00:10:05.069 "w_mbytes_per_sec": 0
00:10:05.069 },
00:10:05.069 "claimed": true,
00:10:05.069 "claim_type": "exclusive_write",
00:10:05.069 "zoned": false,
00:10:05.069 "supported_io_types": {
00:10:05.069 "read": true,
00:10:05.069 "write": true,
00:10:05.069 "unmap": true,
00:10:05.069 "write_zeroes": true,
00:10:05.069 "flush": true,
00:10:05.069 "reset": true,
00:10:05.069 "compare": false,
00:10:05.069 "compare_and_write": false,
00:10:05.069 "abort": true,
00:10:05.069 "nvme_admin": false,
00:10:05.069 "nvme_io": false
00:10:05.069 },
00:10:05.069 "memory_domains": [
00:10:05.069 {
00:10:05.069 "dma_device_id": "system",
00:10:05.069 "dma_device_type": 1
00:10:05.069 },
00:10:05.069 {
00:10:05.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:05.069 "dma_device_type": 2
00:10:05.069 }
00:10:05.069 ],
00:10:05.069 "driver_specific": {}
00:10:05.069 }'
00:10:05.069 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:10:05.328 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:10:05.328 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]]
00:10:05.328 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:10:05.328 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:10:05.328 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:10:05.328 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:10:05.328 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:10:05.328 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:10:05.328 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:10:05.585 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:10:05.585 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]]
00:10:05.585 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:10:05.842 [2024-05-14 11:46:32.675540] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:10:05.842 [2024-05-14 11:46:32.675568] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:10:05.842 [2024-05-14 11:46:32.675609] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered
00:10:05.842 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp
00:10:05.843 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:10:05.843 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:10:06.101 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{
00:10:06.101 "name": "Existed_Raid",
00:10:06.101 "uuid": "3fa16e47-6138-46cf-9375-a8c467374a62",
00:10:06.101 "strip_size_kb": 64,
00:10:06.101 "state": "offline",
00:10:06.101 "raid_level": "raid0",
00:10:06.101 "superblock": false,
00:10:06.101 "num_base_bdevs": 2,
00:10:06.101 "num_base_bdevs_discovered": 1,
00:10:06.101 "num_base_bdevs_operational": 1,
00:10:06.101 "base_bdevs_list": [
00:10:06.101 {
00:10:06.101 "name": null,
00:10:06.101 "uuid": "00000000-0000-0000-0000-000000000000",
00:10:06.101 "is_configured": false,
00:10:06.101 "data_offset": 0,
00:10:06.101 "data_size": 65536
00:10:06.101 },
00:10:06.101 {
00:10:06.101 "name": "BaseBdev2",
00:10:06.101 "uuid": "97fa9dfe-4d00-4888-8146-6cf253ae12f1",
00:10:06.101 "is_configured": true,
00:10:06.101 "data_offset": 0,
00:10:06.101 "data_size": 65536
00:10:06.101 }
00:10:06.101 ]
00:10:06.101 }'
00:10:06.101 11:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable
00:10:06.101 11:46:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:10:06.712 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 ))
00:10:06.712 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs ))
00:10:06.712 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:10:06.712 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]'
00:10:06.970 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid
00:10:06.970 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:10:06.970 11:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
00:10:06.970 [2024-05-14 11:46:34.024545] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:10:06.970 [2024-05-14 11:46:34.024599] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x229fff0 name Existed_Raid, state offline
00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ ))
00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs ))
00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)'
00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # raid_bdev= 00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 1663562 00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 1663562 ']' 00:10:07.227 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 1663562 00:10:07.486 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:10:07.486 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:07.486 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1663562 00:10:07.486 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:07.486 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:07.486 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1663562' 00:10:07.486 killing process with pid 1663562 00:10:07.486 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 1663562 00:10:07.486 [2024-05-14 11:46:34.358313] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:07.486 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 1663562 00:10:07.486 [2024-05-14 11:46:34.359742] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:07.744 11:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:10:07.744 00:10:07.744 real 0m10.665s 00:10:07.744 user 0m18.867s 00:10:07.744 sys 0m1.921s 00:10:07.744 11:46:34 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:07.744 11:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:07.744 ************************************ 00:10:07.744 END TEST raid_state_function_test 00:10:07.744 ************************************ 00:10:07.744 11:46:34 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:07.744 11:46:34 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:10:07.744 11:46:34 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:07.744 11:46:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:07.744 ************************************ 00:10:07.744 START TEST raid_state_function_test_sb 00:10:07.745 ************************************ 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 2 true 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 
00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=1665210 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 
'Process raid pid: 1665210' 00:10:07.745 Process raid pid: 1665210 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 1665210 /var/tmp/spdk-raid.sock 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1665210 ']' 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:07.745 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:07.745 11:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:08.003 [2024-05-14 11:46:34.862991] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:10:08.003 [2024-05-14 11:46:34.863053] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:08.003 [2024-05-14 11:46:34.994822] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:08.261 [2024-05-14 11:46:35.103123] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:08.261 [2024-05-14 11:46:35.168227] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:08.261 [2024-05-14 11:46:35.168250] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:08.827 11:46:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:08.827 11:46:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:10:08.827 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:09.086 [2024-05-14 11:46:35.958617] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:09.086 [2024-05-14 11:46:35.958661] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:09.086 [2024-05-14 11:46:35.958672] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:09.086 [2024-05-14 11:46:35.958684] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:09.086 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:09.086 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:09.086 11:46:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:09.086 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:09.086 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:09.086 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:09.086 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:09.087 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:09.087 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:09.087 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:09.087 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:09.087 11:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:09.345 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:09.345 "name": "Existed_Raid", 00:10:09.345 "uuid": "ba153f5b-f3f8-4958-bdf6-96792656d519", 00:10:09.345 "strip_size_kb": 64, 00:10:09.345 "state": "configuring", 00:10:09.345 "raid_level": "raid0", 00:10:09.345 "superblock": true, 00:10:09.345 "num_base_bdevs": 2, 00:10:09.345 "num_base_bdevs_discovered": 0, 00:10:09.345 "num_base_bdevs_operational": 2, 00:10:09.345 "base_bdevs_list": [ 00:10:09.345 { 00:10:09.345 "name": "BaseBdev1", 00:10:09.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:09.345 "is_configured": false, 00:10:09.345 "data_offset": 0, 00:10:09.345 "data_size": 0 00:10:09.345 }, 00:10:09.345 { 00:10:09.345 "name": 
"BaseBdev2", 00:10:09.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:09.345 "is_configured": false, 00:10:09.345 "data_offset": 0, 00:10:09.345 "data_size": 0 00:10:09.345 } 00:10:09.345 ] 00:10:09.345 }' 00:10:09.345 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:09.345 11:46:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:09.912 11:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:10.170 [2024-05-14 11:46:37.061386] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:10.170 [2024-05-14 11:46:37.061423] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd1700 name Existed_Raid, state configuring 00:10:10.170 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:10.428 [2024-05-14 11:46:37.314074] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:10.428 [2024-05-14 11:46:37.314105] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:10.428 [2024-05-14 11:46:37.314115] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:10.428 [2024-05-14 11:46:37.314127] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:10.428 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:10.686 [2024-05-14 11:46:37.576348] bdev_raid.c:3122:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:10:10.686 BaseBdev1 00:10:10.686 11:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:10:10.686 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:10:10.686 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:10.686 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:10.686 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:10.686 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:10.686 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:10.943 11:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:11.202 [ 00:10:11.202 { 00:10:11.202 "name": "BaseBdev1", 00:10:11.202 "aliases": [ 00:10:11.202 "9bd4117d-410c-4ce3-9ec6-8979c56c8d56" 00:10:11.202 ], 00:10:11.202 "product_name": "Malloc disk", 00:10:11.202 "block_size": 512, 00:10:11.202 "num_blocks": 65536, 00:10:11.202 "uuid": "9bd4117d-410c-4ce3-9ec6-8979c56c8d56", 00:10:11.202 "assigned_rate_limits": { 00:10:11.202 "rw_ios_per_sec": 0, 00:10:11.202 "rw_mbytes_per_sec": 0, 00:10:11.202 "r_mbytes_per_sec": 0, 00:10:11.202 "w_mbytes_per_sec": 0 00:10:11.202 }, 00:10:11.202 "claimed": true, 00:10:11.202 "claim_type": "exclusive_write", 00:10:11.202 "zoned": false, 00:10:11.202 "supported_io_types": { 00:10:11.202 "read": true, 00:10:11.202 "write": true, 00:10:11.202 "unmap": true, 00:10:11.202 "write_zeroes": true, 00:10:11.202 "flush": true, 
00:10:11.202 "reset": true, 00:10:11.202 "compare": false, 00:10:11.202 "compare_and_write": false, 00:10:11.202 "abort": true, 00:10:11.202 "nvme_admin": false, 00:10:11.202 "nvme_io": false 00:10:11.202 }, 00:10:11.202 "memory_domains": [ 00:10:11.202 { 00:10:11.202 "dma_device_id": "system", 00:10:11.202 "dma_device_type": 1 00:10:11.202 }, 00:10:11.202 { 00:10:11.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:11.202 "dma_device_type": 2 00:10:11.202 } 00:10:11.202 ], 00:10:11.202 "driver_specific": {} 00:10:11.202 } 00:10:11.202 ] 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:11.202 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:11.461 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:11.461 "name": "Existed_Raid", 00:10:11.461 "uuid": "9f1f69f6-1b65-4839-b67c-be577f9a560b", 00:10:11.461 "strip_size_kb": 64, 00:10:11.461 "state": "configuring", 00:10:11.461 "raid_level": "raid0", 00:10:11.461 "superblock": true, 00:10:11.461 "num_base_bdevs": 2, 00:10:11.461 "num_base_bdevs_discovered": 1, 00:10:11.461 "num_base_bdevs_operational": 2, 00:10:11.461 "base_bdevs_list": [ 00:10:11.461 { 00:10:11.461 "name": "BaseBdev1", 00:10:11.461 "uuid": "9bd4117d-410c-4ce3-9ec6-8979c56c8d56", 00:10:11.461 "is_configured": true, 00:10:11.461 "data_offset": 2048, 00:10:11.461 "data_size": 63488 00:10:11.461 }, 00:10:11.461 { 00:10:11.461 "name": "BaseBdev2", 00:10:11.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:11.461 "is_configured": false, 00:10:11.461 "data_offset": 0, 00:10:11.461 "data_size": 0 00:10:11.461 } 00:10:11.461 ] 00:10:11.461 }' 00:10:11.461 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:11.461 11:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:12.030 11:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:12.290 [2024-05-14 11:46:39.128457] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:12.290 [2024-05-14 11:46:39.128495] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd19a0 name Existed_Raid, state configuring 00:10:12.290 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:12.550 [2024-05-14 11:46:39.381161] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:12.550 [2024-05-14 11:46:39.382642] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:12.550 [2024-05-14 11:46:39.382671] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:12.550 11:46:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:12.550 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:12.808 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:12.808 "name": "Existed_Raid", 00:10:12.808 "uuid": "6220670d-a370-4fa4-8fa1-838725d41a0e", 00:10:12.808 "strip_size_kb": 64, 00:10:12.808 "state": "configuring", 00:10:12.808 "raid_level": "raid0", 00:10:12.808 "superblock": true, 00:10:12.808 "num_base_bdevs": 2, 00:10:12.808 "num_base_bdevs_discovered": 1, 00:10:12.808 "num_base_bdevs_operational": 2, 00:10:12.808 "base_bdevs_list": [ 00:10:12.808 { 00:10:12.808 "name": "BaseBdev1", 00:10:12.808 "uuid": "9bd4117d-410c-4ce3-9ec6-8979c56c8d56", 00:10:12.808 "is_configured": true, 00:10:12.808 "data_offset": 2048, 00:10:12.808 "data_size": 63488 00:10:12.808 }, 00:10:12.808 { 00:10:12.808 "name": "BaseBdev2", 00:10:12.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:12.808 "is_configured": false, 00:10:12.808 "data_offset": 0, 00:10:12.808 "data_size": 0 00:10:12.808 } 00:10:12.808 ] 00:10:12.808 }' 00:10:12.808 11:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:12.808 11:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:13.376 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:13.376 [2024-05-14 11:46:40.408281] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:13.376 [2024-05-14 11:46:40.408447] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xfd0ff0 00:10:13.376 
[2024-05-14 11:46:40.408462] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:13.376 [2024-05-14 11:46:40.408636] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfd3810 00:10:13.376 [2024-05-14 11:46:40.408752] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfd0ff0 00:10:13.376 [2024-05-14 11:46:40.408761] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfd0ff0 00:10:13.376 [2024-05-14 11:46:40.408850] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:13.376 BaseBdev2 00:10:13.376 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:10:13.376 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:10:13.376 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:13.376 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:13.376 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:13.376 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:13.376 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:13.635 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:13.893 [ 00:10:13.893 { 00:10:13.893 "name": "BaseBdev2", 00:10:13.893 "aliases": [ 00:10:13.893 "4cba3400-7e9b-4464-b7bc-e5af0791df4e" 00:10:13.893 ], 00:10:13.893 "product_name": "Malloc disk", 00:10:13.893 "block_size": 512, 
00:10:13.893 "num_blocks": 65536, 00:10:13.893 "uuid": "4cba3400-7e9b-4464-b7bc-e5af0791df4e", 00:10:13.893 "assigned_rate_limits": { 00:10:13.893 "rw_ios_per_sec": 0, 00:10:13.893 "rw_mbytes_per_sec": 0, 00:10:13.893 "r_mbytes_per_sec": 0, 00:10:13.893 "w_mbytes_per_sec": 0 00:10:13.893 }, 00:10:13.893 "claimed": true, 00:10:13.893 "claim_type": "exclusive_write", 00:10:13.893 "zoned": false, 00:10:13.893 "supported_io_types": { 00:10:13.893 "read": true, 00:10:13.893 "write": true, 00:10:13.893 "unmap": true, 00:10:13.893 "write_zeroes": true, 00:10:13.893 "flush": true, 00:10:13.893 "reset": true, 00:10:13.893 "compare": false, 00:10:13.893 "compare_and_write": false, 00:10:13.893 "abort": true, 00:10:13.893 "nvme_admin": false, 00:10:13.893 "nvme_io": false 00:10:13.893 }, 00:10:13.893 "memory_domains": [ 00:10:13.893 { 00:10:13.893 "dma_device_id": "system", 00:10:13.893 "dma_device_type": 1 00:10:13.893 }, 00:10:13.893 { 00:10:13.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.893 "dma_device_type": 2 00:10:13.893 } 00:10:13.893 ], 00:10:13.893 "driver_specific": {} 00:10:13.893 } 00:10:13.893 ] 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:13.893 11:46:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:13.893 11:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:14.152 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:14.152 "name": "Existed_Raid", 00:10:14.152 "uuid": "6220670d-a370-4fa4-8fa1-838725d41a0e", 00:10:14.152 "strip_size_kb": 64, 00:10:14.152 "state": "online", 00:10:14.152 "raid_level": "raid0", 00:10:14.152 "superblock": true, 00:10:14.152 "num_base_bdevs": 2, 00:10:14.152 "num_base_bdevs_discovered": 2, 00:10:14.152 "num_base_bdevs_operational": 2, 00:10:14.152 "base_bdevs_list": [ 00:10:14.152 { 00:10:14.152 "name": "BaseBdev1", 00:10:14.152 "uuid": "9bd4117d-410c-4ce3-9ec6-8979c56c8d56", 00:10:14.152 "is_configured": true, 00:10:14.152 "data_offset": 2048, 00:10:14.152 "data_size": 63488 00:10:14.152 }, 00:10:14.152 { 00:10:14.152 "name": "BaseBdev2", 00:10:14.152 "uuid": "4cba3400-7e9b-4464-b7bc-e5af0791df4e", 00:10:14.152 "is_configured": true, 00:10:14.152 "data_offset": 2048, 00:10:14.152 "data_size": 63488 00:10:14.152 } 00:10:14.152 ] 00:10:14.152 }' 00:10:14.152 
11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:14.152 11:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:14.718 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:10:14.718 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:10:14.718 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:14.718 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:14.718 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:14.718 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:10:14.718 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:14.718 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:14.976 [2024-05-14 11:46:41.952625] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:14.976 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:14.976 "name": "Existed_Raid", 00:10:14.976 "aliases": [ 00:10:14.976 "6220670d-a370-4fa4-8fa1-838725d41a0e" 00:10:14.976 ], 00:10:14.976 "product_name": "Raid Volume", 00:10:14.976 "block_size": 512, 00:10:14.976 "num_blocks": 126976, 00:10:14.976 "uuid": "6220670d-a370-4fa4-8fa1-838725d41a0e", 00:10:14.976 "assigned_rate_limits": { 00:10:14.976 "rw_ios_per_sec": 0, 00:10:14.976 "rw_mbytes_per_sec": 0, 00:10:14.976 "r_mbytes_per_sec": 0, 00:10:14.976 "w_mbytes_per_sec": 0 00:10:14.976 }, 00:10:14.976 "claimed": false, 00:10:14.976 "zoned": false, 00:10:14.976 
"supported_io_types": { 00:10:14.976 "read": true, 00:10:14.976 "write": true, 00:10:14.976 "unmap": true, 00:10:14.976 "write_zeroes": true, 00:10:14.976 "flush": true, 00:10:14.976 "reset": true, 00:10:14.976 "compare": false, 00:10:14.976 "compare_and_write": false, 00:10:14.976 "abort": false, 00:10:14.976 "nvme_admin": false, 00:10:14.976 "nvme_io": false 00:10:14.976 }, 00:10:14.976 "memory_domains": [ 00:10:14.976 { 00:10:14.976 "dma_device_id": "system", 00:10:14.976 "dma_device_type": 1 00:10:14.976 }, 00:10:14.976 { 00:10:14.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:14.976 "dma_device_type": 2 00:10:14.976 }, 00:10:14.976 { 00:10:14.976 "dma_device_id": "system", 00:10:14.976 "dma_device_type": 1 00:10:14.976 }, 00:10:14.976 { 00:10:14.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:14.976 "dma_device_type": 2 00:10:14.976 } 00:10:14.976 ], 00:10:14.976 "driver_specific": { 00:10:14.976 "raid": { 00:10:14.976 "uuid": "6220670d-a370-4fa4-8fa1-838725d41a0e", 00:10:14.976 "strip_size_kb": 64, 00:10:14.976 "state": "online", 00:10:14.976 "raid_level": "raid0", 00:10:14.976 "superblock": true, 00:10:14.976 "num_base_bdevs": 2, 00:10:14.976 "num_base_bdevs_discovered": 2, 00:10:14.976 "num_base_bdevs_operational": 2, 00:10:14.976 "base_bdevs_list": [ 00:10:14.976 { 00:10:14.976 "name": "BaseBdev1", 00:10:14.976 "uuid": "9bd4117d-410c-4ce3-9ec6-8979c56c8d56", 00:10:14.976 "is_configured": true, 00:10:14.976 "data_offset": 2048, 00:10:14.976 "data_size": 63488 00:10:14.976 }, 00:10:14.976 { 00:10:14.976 "name": "BaseBdev2", 00:10:14.976 "uuid": "4cba3400-7e9b-4464-b7bc-e5af0791df4e", 00:10:14.976 "is_configured": true, 00:10:14.976 "data_offset": 2048, 00:10:14.976 "data_size": 63488 00:10:14.976 } 00:10:14.976 ] 00:10:14.976 } 00:10:14.976 } 00:10:14.976 }' 00:10:14.976 11:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:14.976 
11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:10:14.976 BaseBdev2' 00:10:14.976 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:14.976 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:14.976 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:15.234 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:15.234 "name": "BaseBdev1", 00:10:15.234 "aliases": [ 00:10:15.234 "9bd4117d-410c-4ce3-9ec6-8979c56c8d56" 00:10:15.234 ], 00:10:15.234 "product_name": "Malloc disk", 00:10:15.234 "block_size": 512, 00:10:15.234 "num_blocks": 65536, 00:10:15.234 "uuid": "9bd4117d-410c-4ce3-9ec6-8979c56c8d56", 00:10:15.234 "assigned_rate_limits": { 00:10:15.234 "rw_ios_per_sec": 0, 00:10:15.234 "rw_mbytes_per_sec": 0, 00:10:15.234 "r_mbytes_per_sec": 0, 00:10:15.234 "w_mbytes_per_sec": 0 00:10:15.234 }, 00:10:15.234 "claimed": true, 00:10:15.234 "claim_type": "exclusive_write", 00:10:15.234 "zoned": false, 00:10:15.234 "supported_io_types": { 00:10:15.234 "read": true, 00:10:15.234 "write": true, 00:10:15.234 "unmap": true, 00:10:15.234 "write_zeroes": true, 00:10:15.234 "flush": true, 00:10:15.234 "reset": true, 00:10:15.234 "compare": false, 00:10:15.234 "compare_and_write": false, 00:10:15.234 "abort": true, 00:10:15.234 "nvme_admin": false, 00:10:15.234 "nvme_io": false 00:10:15.234 }, 00:10:15.234 "memory_domains": [ 00:10:15.234 { 00:10:15.234 "dma_device_id": "system", 00:10:15.234 "dma_device_type": 1 00:10:15.234 }, 00:10:15.234 { 00:10:15.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:15.234 "dma_device_type": 2 00:10:15.234 } 00:10:15.234 ], 00:10:15.234 "driver_specific": {} 00:10:15.234 }' 00:10:15.235 11:46:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:15.235 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:15.520 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:15.792 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:15.792 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:15.792 "name": "BaseBdev2", 00:10:15.792 "aliases": [ 00:10:15.792 "4cba3400-7e9b-4464-b7bc-e5af0791df4e" 00:10:15.792 ], 00:10:15.792 "product_name": "Malloc disk", 00:10:15.792 "block_size": 512, 00:10:15.792 
"num_blocks": 65536, 00:10:15.792 "uuid": "4cba3400-7e9b-4464-b7bc-e5af0791df4e", 00:10:15.792 "assigned_rate_limits": { 00:10:15.792 "rw_ios_per_sec": 0, 00:10:15.792 "rw_mbytes_per_sec": 0, 00:10:15.792 "r_mbytes_per_sec": 0, 00:10:15.792 "w_mbytes_per_sec": 0 00:10:15.792 }, 00:10:15.792 "claimed": true, 00:10:15.792 "claim_type": "exclusive_write", 00:10:15.792 "zoned": false, 00:10:15.792 "supported_io_types": { 00:10:15.792 "read": true, 00:10:15.792 "write": true, 00:10:15.792 "unmap": true, 00:10:15.792 "write_zeroes": true, 00:10:15.792 "flush": true, 00:10:15.792 "reset": true, 00:10:15.792 "compare": false, 00:10:15.792 "compare_and_write": false, 00:10:15.792 "abort": true, 00:10:15.792 "nvme_admin": false, 00:10:15.792 "nvme_io": false 00:10:15.792 }, 00:10:15.792 "memory_domains": [ 00:10:15.792 { 00:10:15.792 "dma_device_id": "system", 00:10:15.792 "dma_device_type": 1 00:10:15.792 }, 00:10:15.792 { 00:10:15.792 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:15.792 "dma_device_type": 2 00:10:15.792 } 00:10:15.792 ], 00:10:15.792 "driver_specific": {} 00:10:15.792 }' 00:10:15.792 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:16.051 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:16.051 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:16.051 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:16.051 11:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:16.051 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:16.051 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:16.051 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:16.051 11:46:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:16.051 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:16.309 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:16.309 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:16.309 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:16.567 [2024-05-14 11:46:43.416322] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:16.567 [2024-05-14 11:46:43.416349] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:16.567 [2024-05-14 11:46:43.416392] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
raid_level=raid0 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:16.567 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:16.826 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:16.826 "name": "Existed_Raid", 00:10:16.826 "uuid": "6220670d-a370-4fa4-8fa1-838725d41a0e", 00:10:16.826 "strip_size_kb": 64, 00:10:16.826 "state": "offline", 00:10:16.826 "raid_level": "raid0", 00:10:16.826 "superblock": true, 00:10:16.826 "num_base_bdevs": 2, 00:10:16.826 "num_base_bdevs_discovered": 1, 00:10:16.826 "num_base_bdevs_operational": 1, 00:10:16.826 "base_bdevs_list": [ 00:10:16.826 { 00:10:16.826 "name": null, 00:10:16.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:16.826 "is_configured": false, 00:10:16.826 "data_offset": 2048, 00:10:16.826 "data_size": 63488 00:10:16.826 }, 00:10:16.826 { 00:10:16.826 "name": "BaseBdev2", 00:10:16.826 "uuid": "4cba3400-7e9b-4464-b7bc-e5af0791df4e", 00:10:16.826 "is_configured": true, 00:10:16.826 "data_offset": 2048, 00:10:16.826 "data_size": 63488 00:10:16.826 } 00:10:16.826 
] 00:10:16.826 }' 00:10:16.826 11:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:16.826 11:46:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:17.392 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:10:17.392 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:17.392 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:10:17.392 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.651 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:10:17.651 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:17.651 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:17.651 [2024-05-14 11:46:44.729649] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:17.651 [2024-05-14 11:46:44.729699] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd0ff0 name Existed_Raid, state offline 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | 
select(.)' 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 1665210 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1665210 ']' 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 1665210 00:10:17.909 11:46:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:10:18.167 11:46:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:18.167 11:46:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1665210 00:10:18.167 11:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:18.167 11:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:18.167 11:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1665210' 00:10:18.167 killing process with pid 1665210 00:10:18.167 11:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 1665210 00:10:18.167 [2024-05-14 11:46:45.039126] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:18.167 11:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 1665210 00:10:18.167 [2024-05-14 11:46:45.040096] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:18.426 11:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:10:18.426 
00:10:18.426 real 0m10.461s 00:10:18.426 user 0m18.633s 00:10:18.426 sys 0m1.904s 00:10:18.426 11:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:18.426 11:46:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:18.426 ************************************ 00:10:18.426 END TEST raid_state_function_test_sb 00:10:18.426 ************************************ 00:10:18.426 11:46:45 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:10:18.426 11:46:45 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:10:18.426 11:46:45 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:18.426 11:46:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:18.426 ************************************ 00:10:18.426 START TEST raid_superblock_test 00:10:18.426 ************************************ 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid0 2 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid0 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid0 '!=' raid1 ']' 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=1666781 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 1666781 /var/tmp/spdk-raid.sock 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 1666781 ']' 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:18.426 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:18.426 11:46:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:18.426 [2024-05-14 11:46:45.419765] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:10:18.426 [2024-05-14 11:46:45.419834] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1666781 ] 00:10:18.685 [2024-05-14 11:46:45.548243] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:18.685 [2024-05-14 11:46:45.656688] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:18.685 [2024-05-14 11:46:45.724220] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:18.685 [2024-05-14 11:46:45.724256] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # 
base_bdevs_pt+=($bdev_pt) 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:19.620 malloc1 00:10:19.620 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:19.879 [2024-05-14 11:46:46.817956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:19.879 [2024-05-14 11:46:46.818007] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:19.879 [2024-05-14 11:46:46.818029] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d402a0 00:10:19.879 [2024-05-14 11:46:46.818042] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:19.879 [2024-05-14 11:46:46.819772] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:19.879 [2024-05-14 11:46:46.819803] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:19.879 pt1 00:10:19.879 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:10:19.879 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:10:19.879 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:10:19.879 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:10:19.879 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:19.879 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:10:19.879 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:10:19.879 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:19.879 11:46:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:20.139 malloc2 00:10:20.139 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:20.398 [2024-05-14 11:46:47.312259] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:20.398 [2024-05-14 11:46:47.312305] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:20.398 [2024-05-14 11:46:47.312327] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef3480 00:10:20.398 [2024-05-14 11:46:47.312340] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:20.398 [2024-05-14 11:46:47.313763] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:20.398 [2024-05-14 11:46:47.313792] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:20.398 pt2 00:10:20.398 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:10:20.398 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:10:20.398 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:20.658 [2024-05-14 11:46:47.556927] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:20.658 [2024-05-14 11:46:47.558129] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:20.658 [2024-05-14 11:46:47.558267] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ee92c0 00:10:20.658 [2024-05-14 11:46:47.558280] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:20.658 [2024-05-14 11:46:47.558478] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d3ff70 00:10:20.658 [2024-05-14 11:46:47.558620] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ee92c0 00:10:20.658 [2024-05-14 11:46:47.558630] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ee92c0 00:10:20.658 [2024-05-14 11:46:47.558723] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:20.658 11:46:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:20.658 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:20.917 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:20.917 "name": "raid_bdev1", 00:10:20.917 "uuid": "245d5bc7-de96-4da7-abe2-fb134eb9a832", 00:10:20.917 "strip_size_kb": 64, 00:10:20.917 "state": "online", 00:10:20.917 "raid_level": "raid0", 00:10:20.917 "superblock": true, 00:10:20.917 "num_base_bdevs": 2, 00:10:20.917 "num_base_bdevs_discovered": 2, 00:10:20.917 "num_base_bdevs_operational": 2, 00:10:20.917 "base_bdevs_list": [ 00:10:20.917 { 00:10:20.917 "name": "pt1", 00:10:20.917 "uuid": "01c7e21d-c8bf-5a52-97d8-68f8a94ae873", 00:10:20.917 "is_configured": true, 00:10:20.917 "data_offset": 2048, 00:10:20.917 "data_size": 63488 00:10:20.917 }, 00:10:20.917 { 00:10:20.917 "name": "pt2", 00:10:20.917 "uuid": "132b3edc-f691-5dcc-8668-147815aa0a2c", 00:10:20.917 "is_configured": true, 00:10:20.917 "data_offset": 2048, 00:10:20.917 "data_size": 63488 00:10:20.917 } 00:10:20.917 ] 00:10:20.917 }' 00:10:20.917 11:46:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:20.917 11:46:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:21.485 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:10:21.485 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:10:21.485 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:21.485 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 
00:10:21.485 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:21.485 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:21.485 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:21.485 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:21.744 [2024-05-14 11:46:48.635968] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:21.744 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:21.744 "name": "raid_bdev1", 00:10:21.744 "aliases": [ 00:10:21.744 "245d5bc7-de96-4da7-abe2-fb134eb9a832" 00:10:21.744 ], 00:10:21.744 "product_name": "Raid Volume", 00:10:21.744 "block_size": 512, 00:10:21.744 "num_blocks": 126976, 00:10:21.744 "uuid": "245d5bc7-de96-4da7-abe2-fb134eb9a832", 00:10:21.744 "assigned_rate_limits": { 00:10:21.744 "rw_ios_per_sec": 0, 00:10:21.744 "rw_mbytes_per_sec": 0, 00:10:21.744 "r_mbytes_per_sec": 0, 00:10:21.744 "w_mbytes_per_sec": 0 00:10:21.744 }, 00:10:21.744 "claimed": false, 00:10:21.744 "zoned": false, 00:10:21.744 "supported_io_types": { 00:10:21.744 "read": true, 00:10:21.744 "write": true, 00:10:21.744 "unmap": true, 00:10:21.744 "write_zeroes": true, 00:10:21.744 "flush": true, 00:10:21.744 "reset": true, 00:10:21.744 "compare": false, 00:10:21.744 "compare_and_write": false, 00:10:21.744 "abort": false, 00:10:21.744 "nvme_admin": false, 00:10:21.744 "nvme_io": false 00:10:21.744 }, 00:10:21.744 "memory_domains": [ 00:10:21.744 { 00:10:21.744 "dma_device_id": "system", 00:10:21.744 "dma_device_type": 1 00:10:21.744 }, 00:10:21.744 { 00:10:21.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.744 "dma_device_type": 2 00:10:21.744 }, 00:10:21.744 { 00:10:21.744 "dma_device_id": "system", 00:10:21.744 
"dma_device_type": 1 00:10:21.744 }, 00:10:21.744 { 00:10:21.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.745 "dma_device_type": 2 00:10:21.745 } 00:10:21.745 ], 00:10:21.745 "driver_specific": { 00:10:21.745 "raid": { 00:10:21.745 "uuid": "245d5bc7-de96-4da7-abe2-fb134eb9a832", 00:10:21.745 "strip_size_kb": 64, 00:10:21.745 "state": "online", 00:10:21.745 "raid_level": "raid0", 00:10:21.745 "superblock": true, 00:10:21.745 "num_base_bdevs": 2, 00:10:21.745 "num_base_bdevs_discovered": 2, 00:10:21.745 "num_base_bdevs_operational": 2, 00:10:21.745 "base_bdevs_list": [ 00:10:21.745 { 00:10:21.745 "name": "pt1", 00:10:21.745 "uuid": "01c7e21d-c8bf-5a52-97d8-68f8a94ae873", 00:10:21.745 "is_configured": true, 00:10:21.745 "data_offset": 2048, 00:10:21.745 "data_size": 63488 00:10:21.745 }, 00:10:21.745 { 00:10:21.745 "name": "pt2", 00:10:21.745 "uuid": "132b3edc-f691-5dcc-8668-147815aa0a2c", 00:10:21.745 "is_configured": true, 00:10:21.745 "data_offset": 2048, 00:10:21.745 "data_size": 63488 00:10:21.745 } 00:10:21.745 ] 00:10:21.745 } 00:10:21.745 } 00:10:21.745 }' 00:10:21.745 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:21.745 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:10:21.745 pt2' 00:10:21.745 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:21.745 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:21.745 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:22.003 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:22.003 "name": "pt1", 00:10:22.003 "aliases": [ 00:10:22.003 "01c7e21d-c8bf-5a52-97d8-68f8a94ae873" 00:10:22.003 
], 00:10:22.003 "product_name": "passthru", 00:10:22.003 "block_size": 512, 00:10:22.003 "num_blocks": 65536, 00:10:22.003 "uuid": "01c7e21d-c8bf-5a52-97d8-68f8a94ae873", 00:10:22.003 "assigned_rate_limits": { 00:10:22.003 "rw_ios_per_sec": 0, 00:10:22.003 "rw_mbytes_per_sec": 0, 00:10:22.003 "r_mbytes_per_sec": 0, 00:10:22.003 "w_mbytes_per_sec": 0 00:10:22.003 }, 00:10:22.003 "claimed": true, 00:10:22.003 "claim_type": "exclusive_write", 00:10:22.003 "zoned": false, 00:10:22.003 "supported_io_types": { 00:10:22.003 "read": true, 00:10:22.003 "write": true, 00:10:22.003 "unmap": true, 00:10:22.003 "write_zeroes": true, 00:10:22.003 "flush": true, 00:10:22.003 "reset": true, 00:10:22.003 "compare": false, 00:10:22.003 "compare_and_write": false, 00:10:22.003 "abort": true, 00:10:22.003 "nvme_admin": false, 00:10:22.003 "nvme_io": false 00:10:22.003 }, 00:10:22.003 "memory_domains": [ 00:10:22.003 { 00:10:22.003 "dma_device_id": "system", 00:10:22.003 "dma_device_type": 1 00:10:22.003 }, 00:10:22.003 { 00:10:22.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.003 "dma_device_type": 2 00:10:22.003 } 00:10:22.003 ], 00:10:22.003 "driver_specific": { 00:10:22.003 "passthru": { 00:10:22.003 "name": "pt1", 00:10:22.003 "base_bdev_name": "malloc1" 00:10:22.003 } 00:10:22.003 } 00:10:22.003 }' 00:10:22.003 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:22.003 11:46:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:22.003 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:22.003 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:22.003 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:22.262 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:22.521 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:22.521 "name": "pt2", 00:10:22.521 "aliases": [ 00:10:22.521 "132b3edc-f691-5dcc-8668-147815aa0a2c" 00:10:22.521 ], 00:10:22.521 "product_name": "passthru", 00:10:22.521 "block_size": 512, 00:10:22.521 "num_blocks": 65536, 00:10:22.521 "uuid": "132b3edc-f691-5dcc-8668-147815aa0a2c", 00:10:22.521 "assigned_rate_limits": { 00:10:22.521 "rw_ios_per_sec": 0, 00:10:22.521 "rw_mbytes_per_sec": 0, 00:10:22.521 "r_mbytes_per_sec": 0, 00:10:22.521 "w_mbytes_per_sec": 0 00:10:22.521 }, 00:10:22.521 "claimed": true, 00:10:22.521 "claim_type": "exclusive_write", 00:10:22.521 "zoned": false, 00:10:22.521 "supported_io_types": { 00:10:22.521 "read": true, 00:10:22.521 "write": true, 00:10:22.521 "unmap": true, 00:10:22.521 "write_zeroes": true, 00:10:22.521 "flush": true, 00:10:22.521 "reset": true, 00:10:22.521 "compare": false, 00:10:22.521 "compare_and_write": false, 00:10:22.521 "abort": true, 00:10:22.521 "nvme_admin": false, 00:10:22.521 "nvme_io": false 00:10:22.521 }, 
00:10:22.521 "memory_domains": [ 00:10:22.521 { 00:10:22.521 "dma_device_id": "system", 00:10:22.521 "dma_device_type": 1 00:10:22.521 }, 00:10:22.521 { 00:10:22.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.521 "dma_device_type": 2 00:10:22.521 } 00:10:22.521 ], 00:10:22.521 "driver_specific": { 00:10:22.521 "passthru": { 00:10:22.521 "name": "pt2", 00:10:22.521 "base_bdev_name": "malloc2" 00:10:22.521 } 00:10:22.521 } 00:10:22.521 }' 00:10:22.521 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:22.521 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:22.780 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:23.040 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:23.040 11:46:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:10:23.040 [2024-05-14 11:46:50.095841] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:23.040 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=245d5bc7-de96-4da7-abe2-fb134eb9a832 00:10:23.040 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 245d5bc7-de96-4da7-abe2-fb134eb9a832 ']' 00:10:23.040 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:23.299 [2024-05-14 11:46:50.340253] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:23.299 [2024-05-14 11:46:50.340281] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:23.299 [2024-05-14 11:46:50.340345] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:23.299 [2024-05-14 11:46:50.340390] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:23.299 [2024-05-14 11:46:50.340411] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ee92c0 name raid_bdev1, state offline 00:10:23.299 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.299 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:10:23.559 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:10:23.559 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:10:23.559 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:10:23.559 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:10:23.818 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:10:23.818 11:46:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:24.076 11:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:24.076 11:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:24.334 11:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:10:24.334 11:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:24.334 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:24.335 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:24.594 [2024-05-14 11:46:51.543384] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:24.594 [2024-05-14 11:46:51.544762] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:24.594 [2024-05-14 11:46:51.544828] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:24.594 [2024-05-14 11:46:51.544871] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:24.594 [2024-05-14 11:46:51.544890] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:24.594 [2024-05-14 11:46:51.544900] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ee9000 name raid_bdev1, state configuring 00:10:24.594 request: 00:10:24.594 { 00:10:24.594 "name": "raid_bdev1", 00:10:24.594 "raid_level": "raid0", 00:10:24.594 "base_bdevs": [ 00:10:24.594 "malloc1", 00:10:24.594 "malloc2" 00:10:24.594 ], 00:10:24.594 "superblock": false, 00:10:24.594 "strip_size_kb": 64, 00:10:24.594 "method": "bdev_raid_create", 00:10:24.594 "req_id": 1 00:10:24.594 } 
00:10:24.594 Got JSON-RPC error response 00:10:24.594 response: 00:10:24.594 { 00:10:24.594 "code": -17, 00:10:24.594 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:24.594 } 00:10:24.594 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:24.594 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:24.594 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:24.594 11:46:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:24.594 11:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:24.594 11:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:10:24.853 11:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:10:24.853 11:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:10:24.853 11:46:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:25.115 [2024-05-14 11:46:52.024592] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:25.115 [2024-05-14 11:46:52.024645] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:25.115 [2024-05-14 11:46:52.024671] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d40c60 00:10:25.115 [2024-05-14 11:46:52.024685] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:25.115 [2024-05-14 11:46:52.026369] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:25.115 [2024-05-14 11:46:52.026407] vbdev_passthru.c: 
705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:25.115 [2024-05-14 11:46:52.026500] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:10:25.115 [2024-05-14 11:46:52.026530] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:25.115 pt1 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:25.115 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:25.374 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:25.374 "name": "raid_bdev1", 00:10:25.374 "uuid": "245d5bc7-de96-4da7-abe2-fb134eb9a832", 00:10:25.374 "strip_size_kb": 
64, 00:10:25.374 "state": "configuring", 00:10:25.374 "raid_level": "raid0", 00:10:25.374 "superblock": true, 00:10:25.374 "num_base_bdevs": 2, 00:10:25.374 "num_base_bdevs_discovered": 1, 00:10:25.374 "num_base_bdevs_operational": 2, 00:10:25.374 "base_bdevs_list": [ 00:10:25.374 { 00:10:25.374 "name": "pt1", 00:10:25.374 "uuid": "01c7e21d-c8bf-5a52-97d8-68f8a94ae873", 00:10:25.374 "is_configured": true, 00:10:25.374 "data_offset": 2048, 00:10:25.374 "data_size": 63488 00:10:25.374 }, 00:10:25.374 { 00:10:25.374 "name": null, 00:10:25.374 "uuid": "132b3edc-f691-5dcc-8668-147815aa0a2c", 00:10:25.374 "is_configured": false, 00:10:25.374 "data_offset": 2048, 00:10:25.374 "data_size": 63488 00:10:25.374 } 00:10:25.374 ] 00:10:25.374 }' 00:10:25.374 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:25.374 11:46:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:25.942 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:10:25.942 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:10:25.942 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:10:25.942 11:46:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:26.201 [2024-05-14 11:46:53.103504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:26.201 [2024-05-14 11:46:53.103561] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:26.201 [2024-05-14 11:46:53.103582] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d3a380 00:10:26.201 [2024-05-14 11:46:53.103595] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:26.201 [2024-05-14 
11:46:53.103957] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:26.201 [2024-05-14 11:46:53.103974] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:26.201 [2024-05-14 11:46:53.104043] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:10:26.201 [2024-05-14 11:46:53.104063] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:26.201 [2024-05-14 11:46:53.104160] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d3a900 00:10:26.201 [2024-05-14 11:46:53.104171] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:26.201 [2024-05-14 11:46:53.104344] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d3fc00 00:10:26.201 [2024-05-14 11:46:53.104483] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d3a900 00:10:26.201 [2024-05-14 11:46:53.104494] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d3a900 00:10:26.201 [2024-05-14 11:46:53.104603] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:26.201 pt2 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=64 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:26.201 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:26.459 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:26.460 "name": "raid_bdev1", 00:10:26.460 "uuid": "245d5bc7-de96-4da7-abe2-fb134eb9a832", 00:10:26.460 "strip_size_kb": 64, 00:10:26.460 "state": "online", 00:10:26.460 "raid_level": "raid0", 00:10:26.460 "superblock": true, 00:10:26.460 "num_base_bdevs": 2, 00:10:26.460 "num_base_bdevs_discovered": 2, 00:10:26.460 "num_base_bdevs_operational": 2, 00:10:26.460 "base_bdevs_list": [ 00:10:26.460 { 00:10:26.460 "name": "pt1", 00:10:26.460 "uuid": "01c7e21d-c8bf-5a52-97d8-68f8a94ae873", 00:10:26.460 "is_configured": true, 00:10:26.460 "data_offset": 2048, 00:10:26.460 "data_size": 63488 00:10:26.460 }, 00:10:26.460 { 00:10:26.460 "name": "pt2", 00:10:26.460 "uuid": "132b3edc-f691-5dcc-8668-147815aa0a2c", 00:10:26.460 "is_configured": true, 00:10:26.460 "data_offset": 2048, 00:10:26.460 "data_size": 63488 00:10:26.460 } 00:10:26.460 ] 00:10:26.460 }' 00:10:26.460 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:26.460 11:46:53 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:10:27.027 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:10:27.027 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:10:27.027 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:27.027 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:27.027 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:27.027 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:27.027 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:27.027 11:46:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:27.287 [2024-05-14 11:46:54.114411] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:27.287 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:27.287 "name": "raid_bdev1", 00:10:27.287 "aliases": [ 00:10:27.287 "245d5bc7-de96-4da7-abe2-fb134eb9a832" 00:10:27.287 ], 00:10:27.287 "product_name": "Raid Volume", 00:10:27.287 "block_size": 512, 00:10:27.287 "num_blocks": 126976, 00:10:27.287 "uuid": "245d5bc7-de96-4da7-abe2-fb134eb9a832", 00:10:27.287 "assigned_rate_limits": { 00:10:27.287 "rw_ios_per_sec": 0, 00:10:27.287 "rw_mbytes_per_sec": 0, 00:10:27.287 "r_mbytes_per_sec": 0, 00:10:27.287 "w_mbytes_per_sec": 0 00:10:27.287 }, 00:10:27.287 "claimed": false, 00:10:27.287 "zoned": false, 00:10:27.287 "supported_io_types": { 00:10:27.287 "read": true, 00:10:27.287 "write": true, 00:10:27.287 "unmap": true, 00:10:27.287 "write_zeroes": true, 00:10:27.287 "flush": true, 00:10:27.287 "reset": true, 00:10:27.287 "compare": 
false, 00:10:27.287 "compare_and_write": false, 00:10:27.287 "abort": false, 00:10:27.287 "nvme_admin": false, 00:10:27.287 "nvme_io": false 00:10:27.287 }, 00:10:27.287 "memory_domains": [ 00:10:27.287 { 00:10:27.287 "dma_device_id": "system", 00:10:27.287 "dma_device_type": 1 00:10:27.287 }, 00:10:27.287 { 00:10:27.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.287 "dma_device_type": 2 00:10:27.287 }, 00:10:27.287 { 00:10:27.287 "dma_device_id": "system", 00:10:27.287 "dma_device_type": 1 00:10:27.287 }, 00:10:27.287 { 00:10:27.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.287 "dma_device_type": 2 00:10:27.287 } 00:10:27.287 ], 00:10:27.287 "driver_specific": { 00:10:27.287 "raid": { 00:10:27.287 "uuid": "245d5bc7-de96-4da7-abe2-fb134eb9a832", 00:10:27.287 "strip_size_kb": 64, 00:10:27.287 "state": "online", 00:10:27.287 "raid_level": "raid0", 00:10:27.287 "superblock": true, 00:10:27.288 "num_base_bdevs": 2, 00:10:27.288 "num_base_bdevs_discovered": 2, 00:10:27.288 "num_base_bdevs_operational": 2, 00:10:27.288 "base_bdevs_list": [ 00:10:27.288 { 00:10:27.288 "name": "pt1", 00:10:27.288 "uuid": "01c7e21d-c8bf-5a52-97d8-68f8a94ae873", 00:10:27.288 "is_configured": true, 00:10:27.288 "data_offset": 2048, 00:10:27.288 "data_size": 63488 00:10:27.288 }, 00:10:27.288 { 00:10:27.288 "name": "pt2", 00:10:27.288 "uuid": "132b3edc-f691-5dcc-8668-147815aa0a2c", 00:10:27.288 "is_configured": true, 00:10:27.288 "data_offset": 2048, 00:10:27.288 "data_size": 63488 00:10:27.288 } 00:10:27.288 ] 00:10:27.288 } 00:10:27.288 } 00:10:27.288 }' 00:10:27.288 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:27.288 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:10:27.288 pt2' 00:10:27.288 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:27.288 11:46:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:27.288 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:27.547 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:27.547 "name": "pt1", 00:10:27.547 "aliases": [ 00:10:27.547 "01c7e21d-c8bf-5a52-97d8-68f8a94ae873" 00:10:27.547 ], 00:10:27.547 "product_name": "passthru", 00:10:27.547 "block_size": 512, 00:10:27.547 "num_blocks": 65536, 00:10:27.547 "uuid": "01c7e21d-c8bf-5a52-97d8-68f8a94ae873", 00:10:27.547 "assigned_rate_limits": { 00:10:27.547 "rw_ios_per_sec": 0, 00:10:27.547 "rw_mbytes_per_sec": 0, 00:10:27.547 "r_mbytes_per_sec": 0, 00:10:27.547 "w_mbytes_per_sec": 0 00:10:27.547 }, 00:10:27.547 "claimed": true, 00:10:27.547 "claim_type": "exclusive_write", 00:10:27.547 "zoned": false, 00:10:27.547 "supported_io_types": { 00:10:27.547 "read": true, 00:10:27.547 "write": true, 00:10:27.547 "unmap": true, 00:10:27.547 "write_zeroes": true, 00:10:27.547 "flush": true, 00:10:27.547 "reset": true, 00:10:27.547 "compare": false, 00:10:27.547 "compare_and_write": false, 00:10:27.547 "abort": true, 00:10:27.547 "nvme_admin": false, 00:10:27.547 "nvme_io": false 00:10:27.547 }, 00:10:27.547 "memory_domains": [ 00:10:27.547 { 00:10:27.547 "dma_device_id": "system", 00:10:27.547 "dma_device_type": 1 00:10:27.547 }, 00:10:27.547 { 00:10:27.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.547 "dma_device_type": 2 00:10:27.547 } 00:10:27.547 ], 00:10:27.547 "driver_specific": { 00:10:27.547 "passthru": { 00:10:27.547 "name": "pt1", 00:10:27.547 "base_bdev_name": "malloc1" 00:10:27.547 } 00:10:27.547 } 00:10:27.547 }' 00:10:27.547 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:27.547 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 
00:10:27.547 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:27.547 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:27.547 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:27.547 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:27.547 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:27.806 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:27.806 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:27.806 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:27.806 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:27.806 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:27.806 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:27.806 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:27.806 11:46:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:28.065 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:28.065 "name": "pt2", 00:10:28.065 "aliases": [ 00:10:28.065 "132b3edc-f691-5dcc-8668-147815aa0a2c" 00:10:28.065 ], 00:10:28.065 "product_name": "passthru", 00:10:28.065 "block_size": 512, 00:10:28.065 "num_blocks": 65536, 00:10:28.065 "uuid": "132b3edc-f691-5dcc-8668-147815aa0a2c", 00:10:28.065 "assigned_rate_limits": { 00:10:28.065 "rw_ios_per_sec": 0, 00:10:28.065 "rw_mbytes_per_sec": 0, 00:10:28.065 "r_mbytes_per_sec": 0, 00:10:28.065 "w_mbytes_per_sec": 0 00:10:28.065 }, 00:10:28.065 
"claimed": true, 00:10:28.065 "claim_type": "exclusive_write", 00:10:28.065 "zoned": false, 00:10:28.065 "supported_io_types": { 00:10:28.065 "read": true, 00:10:28.065 "write": true, 00:10:28.065 "unmap": true, 00:10:28.065 "write_zeroes": true, 00:10:28.065 "flush": true, 00:10:28.065 "reset": true, 00:10:28.065 "compare": false, 00:10:28.065 "compare_and_write": false, 00:10:28.065 "abort": true, 00:10:28.065 "nvme_admin": false, 00:10:28.065 "nvme_io": false 00:10:28.065 }, 00:10:28.065 "memory_domains": [ 00:10:28.065 { 00:10:28.065 "dma_device_id": "system", 00:10:28.065 "dma_device_type": 1 00:10:28.065 }, 00:10:28.065 { 00:10:28.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:28.065 "dma_device_type": 2 00:10:28.065 } 00:10:28.065 ], 00:10:28.065 "driver_specific": { 00:10:28.065 "passthru": { 00:10:28.065 "name": "pt2", 00:10:28.065 "base_bdev_name": "malloc2" 00:10:28.065 } 00:10:28.065 } 00:10:28.065 }' 00:10:28.065 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:28.065 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:28.065 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:28.065 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 
-- # jq .dif_type 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:28.324 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:10:28.583 [2024-05-14 11:46:55.582288] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 245d5bc7-de96-4da7-abe2-fb134eb9a832 '!=' 245d5bc7-de96-4da7-abe2-fb134eb9a832 ']' 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid0 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 1666781 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 1666781 ']' 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 1666781 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1666781 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing 
process with pid 1666781' 00:10:28.583 killing process with pid 1666781 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 1666781 00:10:28.583 [2024-05-14 11:46:55.653709] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:28.583 [2024-05-14 11:46:55.653777] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:28.583 [2024-05-14 11:46:55.653818] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:28.583 [2024-05-14 11:46:55.653830] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d3a900 name raid_bdev1, state offline 00:10:28.583 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 1666781 00:10:28.841 [2024-05-14 11:46:55.670227] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:28.841 11:46:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:10:28.841 00:10:28.841 real 0m10.518s 00:10:28.841 user 0m18.726s 00:10:28.841 sys 0m2.013s 00:10:28.841 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:28.841 11:46:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.841 ************************************ 00:10:28.841 END TEST raid_superblock_test 00:10:28.841 ************************************ 00:10:28.841 11:46:55 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:10:28.841 11:46:55 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:10:28.841 11:46:55 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:10:28.841 11:46:55 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:28.841 11:46:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:29.100 ************************************ 00:10:29.100 START TEST 
raid_state_function_test 00:10:29.100 ************************************ 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 2 false 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:29.100 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=1668410 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1668410' 00:10:29.101 Process raid pid: 1668410 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 1668410 /var/tmp/spdk-raid.sock 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 1668410 ']' 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:29.101 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:29.101 11:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:29.101 [2024-05-14 11:46:56.023668] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:10:29.101 [2024-05-14 11:46:56.023729] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:29.101 [2024-05-14 11:46:56.162446] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:29.359 [2024-05-14 11:46:56.269783] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:29.359 [2024-05-14 11:46:56.340077] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.359 [2024-05-14 11:46:56.340113] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.927 11:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:29.927 11:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:10:29.927 11:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:30.187 [2024-05-14 11:46:57.167620] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:30.187 [2024-05-14 11:46:57.167662] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:30.187 [2024-05-14 11:46:57.167674] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:30.187 [2024-05-14 11:46:57.167685] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:30.187 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:30.446 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:30.446 "name": "Existed_Raid", 00:10:30.446 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.446 "strip_size_kb": 64, 00:10:30.446 "state": "configuring", 00:10:30.446 "raid_level": "concat", 00:10:30.446 "superblock": false, 00:10:30.446 "num_base_bdevs": 2, 00:10:30.446 "num_base_bdevs_discovered": 0, 00:10:30.446 "num_base_bdevs_operational": 2, 00:10:30.446 
"base_bdevs_list": [ 00:10:30.446 { 00:10:30.446 "name": "BaseBdev1", 00:10:30.446 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.446 "is_configured": false, 00:10:30.446 "data_offset": 0, 00:10:30.446 "data_size": 0 00:10:30.446 }, 00:10:30.446 { 00:10:30.446 "name": "BaseBdev2", 00:10:30.446 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.446 "is_configured": false, 00:10:30.446 "data_offset": 0, 00:10:30.446 "data_size": 0 00:10:30.446 } 00:10:30.446 ] 00:10:30.446 }' 00:10:30.446 11:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:30.446 11:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.014 11:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:31.274 [2024-05-14 11:46:58.254371] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:31.274 [2024-05-14 11:46:58.254408] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22fb700 name Existed_Raid, state configuring 00:10:31.274 11:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:31.550 [2024-05-14 11:46:58.499017] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:31.550 [2024-05-14 11:46:58.499047] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:31.550 [2024-05-14 11:46:58.499057] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:31.550 [2024-05-14 11:46:58.499069] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:31.550 11:46:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:31.859 [2024-05-14 11:46:58.745494] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:31.859 BaseBdev1 00:10:31.859 11:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:10:31.859 11:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:10:31.859 11:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:31.859 11:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:31.859 11:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:31.859 11:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:31.859 11:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:32.118 11:46:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:32.377 [ 00:10:32.377 { 00:10:32.377 "name": "BaseBdev1", 00:10:32.377 "aliases": [ 00:10:32.377 "34ed0778-6fad-4849-a593-2bc2e44718a0" 00:10:32.377 ], 00:10:32.377 "product_name": "Malloc disk", 00:10:32.377 "block_size": 512, 00:10:32.377 "num_blocks": 65536, 00:10:32.377 "uuid": "34ed0778-6fad-4849-a593-2bc2e44718a0", 00:10:32.377 "assigned_rate_limits": { 00:10:32.377 "rw_ios_per_sec": 0, 00:10:32.377 "rw_mbytes_per_sec": 0, 00:10:32.377 "r_mbytes_per_sec": 0, 00:10:32.377 "w_mbytes_per_sec": 0 00:10:32.377 }, 00:10:32.377 "claimed": true, 
00:10:32.377 "claim_type": "exclusive_write", 00:10:32.377 "zoned": false, 00:10:32.377 "supported_io_types": { 00:10:32.377 "read": true, 00:10:32.377 "write": true, 00:10:32.377 "unmap": true, 00:10:32.377 "write_zeroes": true, 00:10:32.377 "flush": true, 00:10:32.377 "reset": true, 00:10:32.377 "compare": false, 00:10:32.377 "compare_and_write": false, 00:10:32.377 "abort": true, 00:10:32.377 "nvme_admin": false, 00:10:32.377 "nvme_io": false 00:10:32.377 }, 00:10:32.377 "memory_domains": [ 00:10:32.377 { 00:10:32.377 "dma_device_id": "system", 00:10:32.377 "dma_device_type": 1 00:10:32.377 }, 00:10:32.377 { 00:10:32.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:32.377 "dma_device_type": 2 00:10:32.377 } 00:10:32.377 ], 00:10:32.377 "driver_specific": {} 00:10:32.377 } 00:10:32.377 ] 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:32.377 11:46:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:32.377 "name": "Existed_Raid", 00:10:32.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.377 "strip_size_kb": 64, 00:10:32.377 "state": "configuring", 00:10:32.377 "raid_level": "concat", 00:10:32.377 "superblock": false, 00:10:32.377 "num_base_bdevs": 2, 00:10:32.377 "num_base_bdevs_discovered": 1, 00:10:32.377 "num_base_bdevs_operational": 2, 00:10:32.377 "base_bdevs_list": [ 00:10:32.377 { 00:10:32.377 "name": "BaseBdev1", 00:10:32.377 "uuid": "34ed0778-6fad-4849-a593-2bc2e44718a0", 00:10:32.377 "is_configured": true, 00:10:32.377 "data_offset": 0, 00:10:32.377 "data_size": 65536 00:10:32.377 }, 00:10:32.377 { 00:10:32.377 "name": "BaseBdev2", 00:10:32.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.377 "is_configured": false, 00:10:32.377 "data_offset": 0, 00:10:32.377 "data_size": 0 00:10:32.377 } 00:10:32.377 ] 00:10:32.377 }' 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:32.377 11:46:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:32.945 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:33.204 [2024-05-14 11:47:00.237448] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:33.204 [2024-05-14 11:47:00.237490] bdev_raid.c: 350:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x22fb9a0 name Existed_Raid, state configuring 00:10:33.204 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:33.462 [2024-05-14 11:47:00.482111] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:33.462 [2024-05-14 11:47:00.483601] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:33.462 [2024-05-14 11:47:00.483634] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:33.462 
11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.462 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:33.721 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:33.721 "name": "Existed_Raid", 00:10:33.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:33.721 "strip_size_kb": 64, 00:10:33.721 "state": "configuring", 00:10:33.721 "raid_level": "concat", 00:10:33.721 "superblock": false, 00:10:33.721 "num_base_bdevs": 2, 00:10:33.721 "num_base_bdevs_discovered": 1, 00:10:33.721 "num_base_bdevs_operational": 2, 00:10:33.721 "base_bdevs_list": [ 00:10:33.721 { 00:10:33.721 "name": "BaseBdev1", 00:10:33.721 "uuid": "34ed0778-6fad-4849-a593-2bc2e44718a0", 00:10:33.721 "is_configured": true, 00:10:33.721 "data_offset": 0, 00:10:33.721 "data_size": 65536 00:10:33.721 }, 00:10:33.721 { 00:10:33.721 "name": "BaseBdev2", 00:10:33.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:33.721 "is_configured": false, 00:10:33.721 "data_offset": 0, 00:10:33.721 "data_size": 0 00:10:33.721 } 00:10:33.721 ] 00:10:33.721 }' 00:10:33.721 11:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:33.721 11:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.289 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:34.547 [2024-05-14 11:47:01.512378] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:34.547 [2024-05-14 11:47:01.512421] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x22faff0 00:10:34.547 [2024-05-14 11:47:01.512430] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:34.547 [2024-05-14 11:47:01.512620] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22fd810 00:10:34.547 [2024-05-14 11:47:01.512739] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22faff0 00:10:34.547 [2024-05-14 11:47:01.512749] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22faff0 00:10:34.547 [2024-05-14 11:47:01.512912] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:34.547 BaseBdev2 00:10:34.547 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:10:34.547 11:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:10:34.547 11:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:34.547 11:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:34.547 11:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:34.547 11:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:34.547 11:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:34.806 11:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:35.064 [ 00:10:35.064 { 00:10:35.064 "name": "BaseBdev2", 00:10:35.064 "aliases": [ 00:10:35.064 "340aafca-a3e4-4853-8454-ec517eb0cb6d" 00:10:35.064 ], 
00:10:35.064 "product_name": "Malloc disk", 00:10:35.064 "block_size": 512, 00:10:35.064 "num_blocks": 65536, 00:10:35.064 "uuid": "340aafca-a3e4-4853-8454-ec517eb0cb6d", 00:10:35.064 "assigned_rate_limits": { 00:10:35.064 "rw_ios_per_sec": 0, 00:10:35.064 "rw_mbytes_per_sec": 0, 00:10:35.064 "r_mbytes_per_sec": 0, 00:10:35.064 "w_mbytes_per_sec": 0 00:10:35.064 }, 00:10:35.064 "claimed": true, 00:10:35.064 "claim_type": "exclusive_write", 00:10:35.064 "zoned": false, 00:10:35.064 "supported_io_types": { 00:10:35.064 "read": true, 00:10:35.064 "write": true, 00:10:35.064 "unmap": true, 00:10:35.064 "write_zeroes": true, 00:10:35.064 "flush": true, 00:10:35.064 "reset": true, 00:10:35.064 "compare": false, 00:10:35.064 "compare_and_write": false, 00:10:35.064 "abort": true, 00:10:35.064 "nvme_admin": false, 00:10:35.064 "nvme_io": false 00:10:35.064 }, 00:10:35.064 "memory_domains": [ 00:10:35.064 { 00:10:35.064 "dma_device_id": "system", 00:10:35.064 "dma_device_type": 1 00:10:35.064 }, 00:10:35.064 { 00:10:35.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.064 "dma_device_type": 2 00:10:35.064 } 00:10:35.064 ], 00:10:35.064 "driver_specific": {} 00:10:35.064 } 00:10:35.064 ] 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
raid_level=concat 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.064 11:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:35.064 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:35.064 "name": "Existed_Raid", 00:10:35.064 "uuid": "be1027a5-5aa0-499b-85ae-f42701e20340", 00:10:35.064 "strip_size_kb": 64, 00:10:35.064 "state": "online", 00:10:35.064 "raid_level": "concat", 00:10:35.064 "superblock": false, 00:10:35.064 "num_base_bdevs": 2, 00:10:35.064 "num_base_bdevs_discovered": 2, 00:10:35.064 "num_base_bdevs_operational": 2, 00:10:35.064 "base_bdevs_list": [ 00:10:35.064 { 00:10:35.064 "name": "BaseBdev1", 00:10:35.064 "uuid": "34ed0778-6fad-4849-a593-2bc2e44718a0", 00:10:35.064 "is_configured": true, 00:10:35.064 "data_offset": 0, 00:10:35.064 "data_size": 65536 00:10:35.064 }, 00:10:35.064 { 00:10:35.064 "name": "BaseBdev2", 00:10:35.064 "uuid": "340aafca-a3e4-4853-8454-ec517eb0cb6d", 00:10:35.064 "is_configured": true, 00:10:35.064 "data_offset": 0, 00:10:35.064 "data_size": 65536 00:10:35.064 } 00:10:35.064 ] 00:10:35.064 }' 
00:10:35.064 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:35.064 11:47:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:35.632 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:10:35.632 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:10:35.632 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:35.632 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:35.632 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:35.632 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:35.632 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:35.632 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:35.891 [2024-05-14 11:47:02.876237] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:35.891 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:35.891 "name": "Existed_Raid", 00:10:35.891 "aliases": [ 00:10:35.891 "be1027a5-5aa0-499b-85ae-f42701e20340" 00:10:35.891 ], 00:10:35.891 "product_name": "Raid Volume", 00:10:35.891 "block_size": 512, 00:10:35.891 "num_blocks": 131072, 00:10:35.891 "uuid": "be1027a5-5aa0-499b-85ae-f42701e20340", 00:10:35.891 "assigned_rate_limits": { 00:10:35.891 "rw_ios_per_sec": 0, 00:10:35.891 "rw_mbytes_per_sec": 0, 00:10:35.891 "r_mbytes_per_sec": 0, 00:10:35.891 "w_mbytes_per_sec": 0 00:10:35.891 }, 00:10:35.891 "claimed": false, 00:10:35.891 "zoned": false, 00:10:35.891 "supported_io_types": 
{ 00:10:35.891 "read": true, 00:10:35.891 "write": true, 00:10:35.891 "unmap": true, 00:10:35.891 "write_zeroes": true, 00:10:35.891 "flush": true, 00:10:35.891 "reset": true, 00:10:35.891 "compare": false, 00:10:35.891 "compare_and_write": false, 00:10:35.891 "abort": false, 00:10:35.891 "nvme_admin": false, 00:10:35.891 "nvme_io": false 00:10:35.891 }, 00:10:35.891 "memory_domains": [ 00:10:35.891 { 00:10:35.891 "dma_device_id": "system", 00:10:35.891 "dma_device_type": 1 00:10:35.891 }, 00:10:35.891 { 00:10:35.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.891 "dma_device_type": 2 00:10:35.891 }, 00:10:35.891 { 00:10:35.891 "dma_device_id": "system", 00:10:35.891 "dma_device_type": 1 00:10:35.891 }, 00:10:35.891 { 00:10:35.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.891 "dma_device_type": 2 00:10:35.891 } 00:10:35.891 ], 00:10:35.891 "driver_specific": { 00:10:35.891 "raid": { 00:10:35.891 "uuid": "be1027a5-5aa0-499b-85ae-f42701e20340", 00:10:35.891 "strip_size_kb": 64, 00:10:35.891 "state": "online", 00:10:35.891 "raid_level": "concat", 00:10:35.891 "superblock": false, 00:10:35.891 "num_base_bdevs": 2, 00:10:35.891 "num_base_bdevs_discovered": 2, 00:10:35.891 "num_base_bdevs_operational": 2, 00:10:35.891 "base_bdevs_list": [ 00:10:35.891 { 00:10:35.891 "name": "BaseBdev1", 00:10:35.891 "uuid": "34ed0778-6fad-4849-a593-2bc2e44718a0", 00:10:35.891 "is_configured": true, 00:10:35.891 "data_offset": 0, 00:10:35.891 "data_size": 65536 00:10:35.891 }, 00:10:35.891 { 00:10:35.891 "name": "BaseBdev2", 00:10:35.891 "uuid": "340aafca-a3e4-4853-8454-ec517eb0cb6d", 00:10:35.891 "is_configured": true, 00:10:35.891 "data_offset": 0, 00:10:35.891 "data_size": 65536 00:10:35.891 } 00:10:35.891 ] 00:10:35.891 } 00:10:35.891 } 00:10:35.891 }' 00:10:35.891 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:35.891 11:47:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:10:35.891 BaseBdev2' 00:10:35.892 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:35.892 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:35.892 11:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:36.150 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:36.150 "name": "BaseBdev1", 00:10:36.150 "aliases": [ 00:10:36.150 "34ed0778-6fad-4849-a593-2bc2e44718a0" 00:10:36.150 ], 00:10:36.150 "product_name": "Malloc disk", 00:10:36.150 "block_size": 512, 00:10:36.150 "num_blocks": 65536, 00:10:36.150 "uuid": "34ed0778-6fad-4849-a593-2bc2e44718a0", 00:10:36.151 "assigned_rate_limits": { 00:10:36.151 "rw_ios_per_sec": 0, 00:10:36.151 "rw_mbytes_per_sec": 0, 00:10:36.151 "r_mbytes_per_sec": 0, 00:10:36.151 "w_mbytes_per_sec": 0 00:10:36.151 }, 00:10:36.151 "claimed": true, 00:10:36.151 "claim_type": "exclusive_write", 00:10:36.151 "zoned": false, 00:10:36.151 "supported_io_types": { 00:10:36.151 "read": true, 00:10:36.151 "write": true, 00:10:36.151 "unmap": true, 00:10:36.151 "write_zeroes": true, 00:10:36.151 "flush": true, 00:10:36.151 "reset": true, 00:10:36.151 "compare": false, 00:10:36.151 "compare_and_write": false, 00:10:36.151 "abort": true, 00:10:36.151 "nvme_admin": false, 00:10:36.151 "nvme_io": false 00:10:36.151 }, 00:10:36.151 "memory_domains": [ 00:10:36.151 { 00:10:36.151 "dma_device_id": "system", 00:10:36.151 "dma_device_type": 1 00:10:36.151 }, 00:10:36.151 { 00:10:36.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.151 "dma_device_type": 2 00:10:36.151 } 00:10:36.151 ], 00:10:36.151 "driver_specific": {} 00:10:36.151 }' 00:10:36.151 11:47:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:36.151 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:36.409 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:36.668 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:36.668 "name": "BaseBdev2", 00:10:36.668 "aliases": [ 00:10:36.668 "340aafca-a3e4-4853-8454-ec517eb0cb6d" 00:10:36.668 ], 00:10:36.668 "product_name": "Malloc disk", 00:10:36.668 "block_size": 512, 00:10:36.668 "num_blocks": 65536, 00:10:36.668 "uuid": 
"340aafca-a3e4-4853-8454-ec517eb0cb6d", 00:10:36.668 "assigned_rate_limits": { 00:10:36.668 "rw_ios_per_sec": 0, 00:10:36.668 "rw_mbytes_per_sec": 0, 00:10:36.668 "r_mbytes_per_sec": 0, 00:10:36.668 "w_mbytes_per_sec": 0 00:10:36.668 }, 00:10:36.668 "claimed": true, 00:10:36.668 "claim_type": "exclusive_write", 00:10:36.668 "zoned": false, 00:10:36.668 "supported_io_types": { 00:10:36.668 "read": true, 00:10:36.668 "write": true, 00:10:36.668 "unmap": true, 00:10:36.668 "write_zeroes": true, 00:10:36.668 "flush": true, 00:10:36.668 "reset": true, 00:10:36.668 "compare": false, 00:10:36.668 "compare_and_write": false, 00:10:36.668 "abort": true, 00:10:36.668 "nvme_admin": false, 00:10:36.668 "nvme_io": false 00:10:36.668 }, 00:10:36.668 "memory_domains": [ 00:10:36.668 { 00:10:36.668 "dma_device_id": "system", 00:10:36.668 "dma_device_type": 1 00:10:36.668 }, 00:10:36.668 { 00:10:36.668 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.668 "dma_device_type": 2 00:10:36.668 } 00:10:36.668 ], 00:10:36.668 "driver_specific": {} 00:10:36.668 }' 00:10:36.668 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:36.926 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:36.926 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:36.926 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:36.926 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:36.926 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:36.926 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:36.926 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:36.926 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:36.926 
11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:36.926 11:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:36.926 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:36.926 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:37.184 [2024-05-14 11:47:04.227656] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:37.184 [2024-05-14 11:47:04.227682] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:37.184 [2024-05-14 11:47:04.227724] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:37.184 11:47:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.184 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:37.442 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:37.442 "name": "Existed_Raid", 00:10:37.442 "uuid": "be1027a5-5aa0-499b-85ae-f42701e20340", 00:10:37.442 "strip_size_kb": 64, 00:10:37.442 "state": "offline", 00:10:37.442 "raid_level": "concat", 00:10:37.442 "superblock": false, 00:10:37.442 "num_base_bdevs": 2, 00:10:37.442 "num_base_bdevs_discovered": 1, 00:10:37.442 "num_base_bdevs_operational": 1, 00:10:37.442 "base_bdevs_list": [ 00:10:37.442 { 00:10:37.442 "name": null, 00:10:37.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:37.442 "is_configured": false, 00:10:37.442 "data_offset": 0, 00:10:37.442 "data_size": 65536 00:10:37.442 }, 00:10:37.442 { 00:10:37.442 "name": "BaseBdev2", 00:10:37.442 "uuid": "340aafca-a3e4-4853-8454-ec517eb0cb6d", 00:10:37.442 "is_configured": true, 00:10:37.442 "data_offset": 0, 00:10:37.442 "data_size": 65536 00:10:37.442 } 00:10:37.442 ] 00:10:37.442 }' 00:10:37.442 11:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:37.442 11:47:04 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:10:38.009 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:10:38.009 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:38.009 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.009 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:10:38.269 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:10:38.269 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:38.269 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:38.528 [2024-05-14 11:47:05.568219] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:38.528 [2024-05-14 11:47:05.568267] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22faff0 name Existed_Raid, state offline 00:10:38.528 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:10:38.528 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:38.528 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.528 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:10:38.787 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:10:38.787 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 
00:10:38.787 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:10:38.787 11:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 1668410 00:10:38.787 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 1668410 ']' 00:10:38.787 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 1668410 00:10:38.787 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:10:38.787 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:38.787 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1668410 00:10:39.047 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:39.047 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:39.047 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1668410' 00:10:39.047 killing process with pid 1668410 00:10:39.047 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 1668410 00:10:39.047 [2024-05-14 11:47:05.902290] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:39.047 11:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 1668410 00:10:39.047 [2024-05-14 11:47:05.903214] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:39.047 11:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:10:39.047 00:10:39.047 real 0m10.166s 00:10:39.047 user 0m17.997s 00:10:39.047 sys 0m1.912s 00:10:39.047 11:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:39.047 11:47:06 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:10:39.047 ************************************ 00:10:39.047 END TEST raid_state_function_test 00:10:39.047 ************************************ 00:10:39.306 11:47:06 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:10:39.306 11:47:06 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:10:39.306 11:47:06 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:39.306 11:47:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:39.306 ************************************ 00:10:39.306 START TEST raid_state_function_test_sb 00:10:39.306 ************************************ 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 2 true 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:10:39.306 11:47:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=1670041 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1670041' 00:10:39.306 Process raid pid: 1670041 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:39.306 11:47:06 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@247 -- # waitforlisten 1670041 /var/tmp/spdk-raid.sock 00:10:39.307 11:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1670041 ']' 00:10:39.307 11:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:39.307 11:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:39.307 11:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:39.307 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:39.307 11:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:39.307 11:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:39.307 [2024-05-14 11:47:06.272594] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
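The log above repeatedly pipes `bdev_get_bdevs` output through a `jq` filter (bdev_raid.sh@202) to collect the names of configured base bdevs. As a minimal standalone sketch of that filter, using hypothetical inline sample JSON instead of a live SPDK RPC socket (the inline data is an assumption for illustration only):

```shell
#!/bin/sh
# Hypothetical inline sample shaped like `bdev_get_bdevs` output for a raid
# volume; a live run would fetch this over the RPC socket instead, e.g.:
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
json='{"driver_specific":{"raid":{"base_bdevs_list":[
        {"name":"BaseBdev1","is_configured":true},
        {"name":"BaseBdev2","is_configured":true}]}}}'

# Same filter pattern as bdev_raid.sh@202: keep only the names of base bdevs
# whose is_configured flag is true.
names=$(printf '%s' "$json" |
        jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')
echo "$names"
```

In the test itself the resulting whitespace-separated list is then iterated with `for name in $base_bdev_names` to inspect each base bdev individually.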
00:10:39.307 [2024-05-14 11:47:06.272656] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:39.585 [2024-05-14 11:47:06.401963] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:39.585 [2024-05-14 11:47:06.508135] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:39.585 [2024-05-14 11:47:06.574843] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:39.585 [2024-05-14 11:47:06.574874] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:40.153 11:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:40.153 11:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:10:40.153 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:40.412 [2024-05-14 11:47:07.406228] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:40.412 [2024-05-14 11:47:07.406270] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:40.412 [2024-05-14 11:47:07.406281] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:40.412 [2024-05-14 11:47:07.406293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:40.412 11:47:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:40.412 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:40.671 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:40.671 "name": "Existed_Raid", 00:10:40.671 "uuid": "ce87459c-118c-4076-a5ae-ead4f17f99db", 00:10:40.671 "strip_size_kb": 64, 00:10:40.671 "state": "configuring", 00:10:40.671 "raid_level": "concat", 00:10:40.671 "superblock": true, 00:10:40.671 "num_base_bdevs": 2, 00:10:40.671 "num_base_bdevs_discovered": 0, 00:10:40.671 "num_base_bdevs_operational": 2, 00:10:40.671 "base_bdevs_list": [ 00:10:40.671 { 00:10:40.671 "name": "BaseBdev1", 00:10:40.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:40.671 "is_configured": false, 00:10:40.671 "data_offset": 0, 00:10:40.671 "data_size": 0 00:10:40.671 }, 00:10:40.671 { 00:10:40.671 "name": 
"BaseBdev2", 00:10:40.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:40.671 "is_configured": false, 00:10:40.671 "data_offset": 0, 00:10:40.671 "data_size": 0 00:10:40.671 } 00:10:40.671 ] 00:10:40.671 }' 00:10:40.671 11:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:40.671 11:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:41.238 11:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:41.497 [2024-05-14 11:47:08.368636] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:41.497 [2024-05-14 11:47:08.368665] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x188d700 name Existed_Raid, state configuring 00:10:41.497 11:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:41.755 [2024-05-14 11:47:08.609303] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:41.755 [2024-05-14 11:47:08.609333] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:41.755 [2024-05-14 11:47:08.609344] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:41.755 [2024-05-14 11:47:08.609355] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:41.755 11:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:41.755 [2024-05-14 11:47:08.799603] bdev_raid.c:3122:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:10:41.755 BaseBdev1 00:10:41.756 11:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:10:41.756 11:47:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:10:41.756 11:47:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:41.756 11:47:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:41.756 11:47:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:41.756 11:47:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:41.756 11:47:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:42.014 11:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:42.272 [ 00:10:42.272 { 00:10:42.272 "name": "BaseBdev1", 00:10:42.272 "aliases": [ 00:10:42.272 "6703022b-3ef2-48d5-9198-7f339bf01371" 00:10:42.272 ], 00:10:42.272 "product_name": "Malloc disk", 00:10:42.272 "block_size": 512, 00:10:42.272 "num_blocks": 65536, 00:10:42.272 "uuid": "6703022b-3ef2-48d5-9198-7f339bf01371", 00:10:42.272 "assigned_rate_limits": { 00:10:42.272 "rw_ios_per_sec": 0, 00:10:42.272 "rw_mbytes_per_sec": 0, 00:10:42.272 "r_mbytes_per_sec": 0, 00:10:42.272 "w_mbytes_per_sec": 0 00:10:42.272 }, 00:10:42.272 "claimed": true, 00:10:42.272 "claim_type": "exclusive_write", 00:10:42.272 "zoned": false, 00:10:42.272 "supported_io_types": { 00:10:42.272 "read": true, 00:10:42.272 "write": true, 00:10:42.272 "unmap": true, 00:10:42.272 "write_zeroes": true, 00:10:42.272 "flush": true, 
00:10:42.272 "reset": true, 00:10:42.272 "compare": false, 00:10:42.272 "compare_and_write": false, 00:10:42.272 "abort": true, 00:10:42.272 "nvme_admin": false, 00:10:42.272 "nvme_io": false 00:10:42.272 }, 00:10:42.272 "memory_domains": [ 00:10:42.272 { 00:10:42.272 "dma_device_id": "system", 00:10:42.272 "dma_device_type": 1 00:10:42.272 }, 00:10:42.272 { 00:10:42.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:42.272 "dma_device_type": 2 00:10:42.272 } 00:10:42.272 ], 00:10:42.272 "driver_specific": {} 00:10:42.272 } 00:10:42.272 ] 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:42.272 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:42.273 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:42.273 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:42.531 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:42.531 "name": "Existed_Raid", 00:10:42.531 "uuid": "73aef41f-b169-4939-9e4f-7426fe2b3912", 00:10:42.531 "strip_size_kb": 64, 00:10:42.531 "state": "configuring", 00:10:42.531 "raid_level": "concat", 00:10:42.531 "superblock": true, 00:10:42.531 "num_base_bdevs": 2, 00:10:42.531 "num_base_bdevs_discovered": 1, 00:10:42.531 "num_base_bdevs_operational": 2, 00:10:42.531 "base_bdevs_list": [ 00:10:42.531 { 00:10:42.531 "name": "BaseBdev1", 00:10:42.531 "uuid": "6703022b-3ef2-48d5-9198-7f339bf01371", 00:10:42.531 "is_configured": true, 00:10:42.531 "data_offset": 2048, 00:10:42.531 "data_size": 63488 00:10:42.531 }, 00:10:42.531 { 00:10:42.531 "name": "BaseBdev2", 00:10:42.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:42.531 "is_configured": false, 00:10:42.531 "data_offset": 0, 00:10:42.531 "data_size": 0 00:10:42.531 } 00:10:42.531 ] 00:10:42.531 }' 00:10:42.531 11:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:42.531 11:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:43.097 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:43.355 [2024-05-14 11:47:10.335669] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:43.355 [2024-05-14 11:47:10.335714] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x188d9a0 name Existed_Raid, state configuring 00:10:43.355 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:43.614 [2024-05-14 11:47:10.580352] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:43.614 [2024-05-14 11:47:10.581840] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:43.614 [2024-05-14 11:47:10.581871] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:43.614 11:47:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.614 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:43.873 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:43.873 "name": "Existed_Raid", 00:10:43.873 "uuid": "37cc37fa-413c-4373-917f-20a01fafd43d", 00:10:43.873 "strip_size_kb": 64, 00:10:43.873 "state": "configuring", 00:10:43.873 "raid_level": "concat", 00:10:43.873 "superblock": true, 00:10:43.873 "num_base_bdevs": 2, 00:10:43.873 "num_base_bdevs_discovered": 1, 00:10:43.873 "num_base_bdevs_operational": 2, 00:10:43.873 "base_bdevs_list": [ 00:10:43.873 { 00:10:43.873 "name": "BaseBdev1", 00:10:43.873 "uuid": "6703022b-3ef2-48d5-9198-7f339bf01371", 00:10:43.873 "is_configured": true, 00:10:43.873 "data_offset": 2048, 00:10:43.873 "data_size": 63488 00:10:43.873 }, 00:10:43.873 { 00:10:43.873 "name": "BaseBdev2", 00:10:43.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:43.873 "is_configured": false, 00:10:43.873 "data_offset": 0, 00:10:43.873 "data_size": 0 00:10:43.873 } 00:10:43.873 ] 00:10:43.873 }' 00:10:43.873 11:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:43.873 11:47:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:44.439 11:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:44.697 [2024-05-14 11:47:11.682728] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:44.697 [2024-05-14 11:47:11.682880] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x188cff0 00:10:44.697 
[2024-05-14 11:47:11.682894] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:44.697 [2024-05-14 11:47:11.683064] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x188f810 00:10:44.697 [2024-05-14 11:47:11.683175] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x188cff0 00:10:44.697 [2024-05-14 11:47:11.683185] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x188cff0 00:10:44.697 [2024-05-14 11:47:11.683273] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:44.697 BaseBdev2 00:10:44.697 11:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:10:44.697 11:47:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:10:44.697 11:47:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:44.697 11:47:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:44.697 11:47:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:44.697 11:47:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:44.697 11:47:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:44.955 11:47:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:45.214 [ 00:10:45.214 { 00:10:45.214 "name": "BaseBdev2", 00:10:45.214 "aliases": [ 00:10:45.214 "053e586d-7ab2-45a2-9759-f60bef13b35d" 00:10:45.214 ], 00:10:45.214 "product_name": "Malloc disk", 00:10:45.214 "block_size": 512, 
00:10:45.214 "num_blocks": 65536, 00:10:45.214 "uuid": "053e586d-7ab2-45a2-9759-f60bef13b35d", 00:10:45.214 "assigned_rate_limits": { 00:10:45.214 "rw_ios_per_sec": 0, 00:10:45.214 "rw_mbytes_per_sec": 0, 00:10:45.214 "r_mbytes_per_sec": 0, 00:10:45.214 "w_mbytes_per_sec": 0 00:10:45.214 }, 00:10:45.214 "claimed": true, 00:10:45.214 "claim_type": "exclusive_write", 00:10:45.214 "zoned": false, 00:10:45.214 "supported_io_types": { 00:10:45.214 "read": true, 00:10:45.214 "write": true, 00:10:45.214 "unmap": true, 00:10:45.214 "write_zeroes": true, 00:10:45.214 "flush": true, 00:10:45.214 "reset": true, 00:10:45.214 "compare": false, 00:10:45.214 "compare_and_write": false, 00:10:45.214 "abort": true, 00:10:45.214 "nvme_admin": false, 00:10:45.214 "nvme_io": false 00:10:45.214 }, 00:10:45.214 "memory_domains": [ 00:10:45.214 { 00:10:45.214 "dma_device_id": "system", 00:10:45.214 "dma_device_type": 1 00:10:45.214 }, 00:10:45.214 { 00:10:45.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:45.214 "dma_device_type": 2 00:10:45.214 } 00:10:45.214 ], 00:10:45.214 "driver_specific": {} 00:10:45.214 } 00:10:45.214 ] 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:45.214 11:47:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.214 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:45.473 11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:45.473 "name": "Existed_Raid", 00:10:45.473 "uuid": "37cc37fa-413c-4373-917f-20a01fafd43d", 00:10:45.473 "strip_size_kb": 64, 00:10:45.473 "state": "online", 00:10:45.473 "raid_level": "concat", 00:10:45.473 "superblock": true, 00:10:45.473 "num_base_bdevs": 2, 00:10:45.473 "num_base_bdevs_discovered": 2, 00:10:45.473 "num_base_bdevs_operational": 2, 00:10:45.473 "base_bdevs_list": [ 00:10:45.473 { 00:10:45.473 "name": "BaseBdev1", 00:10:45.473 "uuid": "6703022b-3ef2-48d5-9198-7f339bf01371", 00:10:45.473 "is_configured": true, 00:10:45.473 "data_offset": 2048, 00:10:45.473 "data_size": 63488 00:10:45.473 }, 00:10:45.473 { 00:10:45.473 "name": "BaseBdev2", 00:10:45.473 "uuid": "053e586d-7ab2-45a2-9759-f60bef13b35d", 00:10:45.473 "is_configured": true, 00:10:45.473 "data_offset": 2048, 00:10:45.473 "data_size": 63488 00:10:45.473 } 00:10:45.473 ] 00:10:45.473 }' 00:10:45.473 
11:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:45.473 11:47:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:46.040 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:10:46.040 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:10:46.040 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:46.040 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:46.040 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:46.040 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:10:46.040 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:46.040 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:46.299 [2024-05-14 11:47:13.235075] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:46.299 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:46.299 "name": "Existed_Raid", 00:10:46.299 "aliases": [ 00:10:46.299 "37cc37fa-413c-4373-917f-20a01fafd43d" 00:10:46.299 ], 00:10:46.299 "product_name": "Raid Volume", 00:10:46.299 "block_size": 512, 00:10:46.299 "num_blocks": 126976, 00:10:46.299 "uuid": "37cc37fa-413c-4373-917f-20a01fafd43d", 00:10:46.299 "assigned_rate_limits": { 00:10:46.299 "rw_ios_per_sec": 0, 00:10:46.299 "rw_mbytes_per_sec": 0, 00:10:46.299 "r_mbytes_per_sec": 0, 00:10:46.299 "w_mbytes_per_sec": 0 00:10:46.299 }, 00:10:46.299 "claimed": false, 00:10:46.299 "zoned": false, 00:10:46.299 
"supported_io_types": { 00:10:46.299 "read": true, 00:10:46.299 "write": true, 00:10:46.299 "unmap": true, 00:10:46.299 "write_zeroes": true, 00:10:46.299 "flush": true, 00:10:46.299 "reset": true, 00:10:46.299 "compare": false, 00:10:46.299 "compare_and_write": false, 00:10:46.299 "abort": false, 00:10:46.299 "nvme_admin": false, 00:10:46.299 "nvme_io": false 00:10:46.299 }, 00:10:46.299 "memory_domains": [ 00:10:46.299 { 00:10:46.299 "dma_device_id": "system", 00:10:46.299 "dma_device_type": 1 00:10:46.299 }, 00:10:46.299 { 00:10:46.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.299 "dma_device_type": 2 00:10:46.299 }, 00:10:46.299 { 00:10:46.299 "dma_device_id": "system", 00:10:46.299 "dma_device_type": 1 00:10:46.299 }, 00:10:46.299 { 00:10:46.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.299 "dma_device_type": 2 00:10:46.299 } 00:10:46.299 ], 00:10:46.299 "driver_specific": { 00:10:46.300 "raid": { 00:10:46.300 "uuid": "37cc37fa-413c-4373-917f-20a01fafd43d", 00:10:46.300 "strip_size_kb": 64, 00:10:46.300 "state": "online", 00:10:46.300 "raid_level": "concat", 00:10:46.300 "superblock": true, 00:10:46.300 "num_base_bdevs": 2, 00:10:46.300 "num_base_bdevs_discovered": 2, 00:10:46.300 "num_base_bdevs_operational": 2, 00:10:46.300 "base_bdevs_list": [ 00:10:46.300 { 00:10:46.300 "name": "BaseBdev1", 00:10:46.300 "uuid": "6703022b-3ef2-48d5-9198-7f339bf01371", 00:10:46.300 "is_configured": true, 00:10:46.300 "data_offset": 2048, 00:10:46.300 "data_size": 63488 00:10:46.300 }, 00:10:46.300 { 00:10:46.300 "name": "BaseBdev2", 00:10:46.300 "uuid": "053e586d-7ab2-45a2-9759-f60bef13b35d", 00:10:46.300 "is_configured": true, 00:10:46.300 "data_offset": 2048, 00:10:46.300 "data_size": 63488 00:10:46.300 } 00:10:46.300 ] 00:10:46.300 } 00:10:46.300 } 00:10:46.300 }' 00:10:46.300 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:46.300 
11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:10:46.300 BaseBdev2' 00:10:46.300 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:46.300 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:46.300 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:46.559 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:46.559 "name": "BaseBdev1", 00:10:46.559 "aliases": [ 00:10:46.559 "6703022b-3ef2-48d5-9198-7f339bf01371" 00:10:46.559 ], 00:10:46.559 "product_name": "Malloc disk", 00:10:46.559 "block_size": 512, 00:10:46.559 "num_blocks": 65536, 00:10:46.559 "uuid": "6703022b-3ef2-48d5-9198-7f339bf01371", 00:10:46.559 "assigned_rate_limits": { 00:10:46.559 "rw_ios_per_sec": 0, 00:10:46.559 "rw_mbytes_per_sec": 0, 00:10:46.559 "r_mbytes_per_sec": 0, 00:10:46.559 "w_mbytes_per_sec": 0 00:10:46.559 }, 00:10:46.559 "claimed": true, 00:10:46.559 "claim_type": "exclusive_write", 00:10:46.559 "zoned": false, 00:10:46.559 "supported_io_types": { 00:10:46.559 "read": true, 00:10:46.559 "write": true, 00:10:46.559 "unmap": true, 00:10:46.559 "write_zeroes": true, 00:10:46.559 "flush": true, 00:10:46.559 "reset": true, 00:10:46.559 "compare": false, 00:10:46.559 "compare_and_write": false, 00:10:46.559 "abort": true, 00:10:46.559 "nvme_admin": false, 00:10:46.559 "nvme_io": false 00:10:46.559 }, 00:10:46.559 "memory_domains": [ 00:10:46.559 { 00:10:46.559 "dma_device_id": "system", 00:10:46.559 "dma_device_type": 1 00:10:46.559 }, 00:10:46.559 { 00:10:46.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.559 "dma_device_type": 2 00:10:46.559 } 00:10:46.559 ], 00:10:46.559 "driver_specific": {} 00:10:46.559 }' 00:10:46.559 11:47:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:46.559 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:46.559 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:46.559 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:46.818 11:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:47.076 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:47.076 "name": "BaseBdev2", 00:10:47.076 "aliases": [ 00:10:47.076 "053e586d-7ab2-45a2-9759-f60bef13b35d" 00:10:47.076 ], 00:10:47.076 "product_name": "Malloc disk", 00:10:47.076 "block_size": 512, 00:10:47.076 
"num_blocks": 65536, 00:10:47.076 "uuid": "053e586d-7ab2-45a2-9759-f60bef13b35d", 00:10:47.076 "assigned_rate_limits": { 00:10:47.076 "rw_ios_per_sec": 0, 00:10:47.076 "rw_mbytes_per_sec": 0, 00:10:47.076 "r_mbytes_per_sec": 0, 00:10:47.076 "w_mbytes_per_sec": 0 00:10:47.076 }, 00:10:47.076 "claimed": true, 00:10:47.076 "claim_type": "exclusive_write", 00:10:47.076 "zoned": false, 00:10:47.076 "supported_io_types": { 00:10:47.076 "read": true, 00:10:47.076 "write": true, 00:10:47.076 "unmap": true, 00:10:47.076 "write_zeroes": true, 00:10:47.076 "flush": true, 00:10:47.076 "reset": true, 00:10:47.076 "compare": false, 00:10:47.076 "compare_and_write": false, 00:10:47.076 "abort": true, 00:10:47.076 "nvme_admin": false, 00:10:47.076 "nvme_io": false 00:10:47.076 }, 00:10:47.076 "memory_domains": [ 00:10:47.076 { 00:10:47.076 "dma_device_id": "system", 00:10:47.076 "dma_device_type": 1 00:10:47.076 }, 00:10:47.076 { 00:10:47.076 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:47.076 "dma_device_type": 2 00:10:47.076 } 00:10:47.076 ], 00:10:47.076 "driver_specific": {} 00:10:47.076 }' 00:10:47.335 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:47.335 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:47.335 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:47.335 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:47.335 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:47.335 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:47.335 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:47.335 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:47.335 11:47:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:47.335 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:47.595 [2024-05-14 11:47:14.622584] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:47.595 [2024-05-14 11:47:14.622608] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:47.595 [2024-05-14 11:47:14.622655] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
raid_level=concat 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.595 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:47.895 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:47.895 "name": "Existed_Raid", 00:10:47.895 "uuid": "37cc37fa-413c-4373-917f-20a01fafd43d", 00:10:47.895 "strip_size_kb": 64, 00:10:47.895 "state": "offline", 00:10:47.895 "raid_level": "concat", 00:10:47.895 "superblock": true, 00:10:47.895 "num_base_bdevs": 2, 00:10:47.895 "num_base_bdevs_discovered": 1, 00:10:47.895 "num_base_bdevs_operational": 1, 00:10:47.895 "base_bdevs_list": [ 00:10:47.895 { 00:10:47.895 "name": null, 00:10:47.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.895 "is_configured": false, 00:10:47.895 "data_offset": 2048, 00:10:47.895 "data_size": 63488 00:10:47.895 }, 00:10:47.895 { 00:10:47.895 "name": "BaseBdev2", 00:10:47.895 "uuid": "053e586d-7ab2-45a2-9759-f60bef13b35d", 00:10:47.895 "is_configured": true, 00:10:47.895 "data_offset": 2048, 00:10:47.895 "data_size": 63488 00:10:47.895 } 
00:10:47.895 ] 00:10:47.895 }' 00:10:47.895 11:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:47.895 11:47:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:48.462 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:10:48.462 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:48.462 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.462 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:10:48.720 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:10:48.720 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:48.720 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:48.979 [2024-05-14 11:47:15.882913] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:48.979 [2024-05-14 11:47:15.882963] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x188cff0 name Existed_Raid, state offline 00:10:48.979 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:10:48.979 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:48.979 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.979 11:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r 
'.[0]["name"] | select(.)' 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 1670041 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1670041 ']' 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 1670041 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1670041 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1670041' 00:10:49.238 killing process with pid 1670041 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 1670041 00:10:49.238 [2024-05-14 11:47:16.205296] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:49.238 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 1670041 00:10:49.238 [2024-05-14 11:47:16.206166] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:49.497 11:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 
00:10:49.497 00:10:49.497 real 0m10.208s 00:10:49.497 user 0m18.190s 00:10:49.497 sys 0m1.880s 00:10:49.497 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:49.497 11:47:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:49.497 ************************************ 00:10:49.497 END TEST raid_state_function_test_sb 00:10:49.497 ************************************ 00:10:49.497 11:47:16 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:10:49.497 11:47:16 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:10:49.497 11:47:16 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:49.497 11:47:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:49.497 ************************************ 00:10:49.497 START TEST raid_superblock_test 00:10:49.497 ************************************ 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test concat 2 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=concat 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' concat '!=' raid1 ']' 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=1671539 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 1671539 /var/tmp/spdk-raid.sock 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 1671539 ']' 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:49.497 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:49.497 11:47:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.497 [2024-05-14 11:47:16.564919] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:10:49.497 [2024-05-14 11:47:16.564984] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1671539 ] 00:10:49.757 [2024-05-14 11:47:16.692419] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:49.757 [2024-05-14 11:47:16.799168] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:50.015 [2024-05-14 11:47:16.869493] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:50.015 [2024-05-14 11:47:16.869533] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # 
base_bdevs_pt+=($bdev_pt) 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:50.583 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:50.842 malloc1 00:10:50.842 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:51.102 [2024-05-14 11:47:17.955531] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:51.102 [2024-05-14 11:47:17.955582] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:51.102 [2024-05-14 11:47:17.955606] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8182a0 00:10:51.102 [2024-05-14 11:47:17.955625] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:51.102 [2024-05-14 11:47:17.957420] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:51.102 [2024-05-14 11:47:17.957450] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:51.102 pt1 00:10:51.102 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:10:51.102 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:10:51.102 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:10:51.102 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:10:51.102 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:51.102 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:10:51.102 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:10:51.102 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:51.102 11:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:51.361 malloc2 00:10:51.361 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:51.361 [2024-05-14 11:47:18.434885] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:51.361 [2024-05-14 11:47:18.434927] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:51.361 [2024-05-14 11:47:18.434952] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9cb480 00:10:51.361 [2024-05-14 11:47:18.434965] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:51.361 [2024-05-14 11:47:18.436511] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:51.361 [2024-05-14 11:47:18.436538] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:51.361 pt2 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:10:51.620 [2024-05-14 11:47:18.679562] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:51.620 [2024-05-14 11:47:18.680931] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:51.620 [2024-05-14 11:47:18.681079] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x9c12c0 00:10:51.620 [2024-05-14 11:47:18.681093] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:51.620 [2024-05-14 11:47:18.681292] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x817f70 00:10:51.620 [2024-05-14 11:47:18.681454] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9c12c0 00:10:51.620 [2024-05-14 11:47:18.681465] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9c12c0 00:10:51.620 [2024-05-14 11:47:18.681567] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:51.620 11:47:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:51.620 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:51.878 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:51.878 "name": "raid_bdev1", 00:10:51.878 "uuid": "7f8177ae-1870-4dc9-9601-8290875a8560", 00:10:51.878 "strip_size_kb": 64, 00:10:51.878 "state": "online", 00:10:51.878 "raid_level": "concat", 00:10:51.878 "superblock": true, 00:10:51.878 "num_base_bdevs": 2, 00:10:51.878 "num_base_bdevs_discovered": 2, 00:10:51.878 "num_base_bdevs_operational": 2, 00:10:51.878 "base_bdevs_list": [ 00:10:51.878 { 00:10:51.878 "name": "pt1", 00:10:51.878 "uuid": "958e668c-c50b-58e0-80eb-66cb5186cfec", 00:10:51.878 "is_configured": true, 00:10:51.878 "data_offset": 2048, 00:10:51.878 "data_size": 63488 00:10:51.878 }, 00:10:51.878 { 00:10:51.878 "name": "pt2", 00:10:51.878 "uuid": "44bab209-5d0b-552e-ae4f-0d7d3b52d558", 00:10:51.878 "is_configured": true, 00:10:51.878 "data_offset": 2048, 00:10:51.878 "data_size": 63488 00:10:51.878 } 00:10:51.878 ] 00:10:51.878 }' 00:10:51.878 11:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:51.878 11:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.446 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:10:52.446 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:10:52.446 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:52.446 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_info 00:10:52.446 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:52.446 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:52.446 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:52.446 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:52.705 [2024-05-14 11:47:19.690425] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:52.705 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:52.705 "name": "raid_bdev1", 00:10:52.705 "aliases": [ 00:10:52.705 "7f8177ae-1870-4dc9-9601-8290875a8560" 00:10:52.705 ], 00:10:52.705 "product_name": "Raid Volume", 00:10:52.705 "block_size": 512, 00:10:52.705 "num_blocks": 126976, 00:10:52.705 "uuid": "7f8177ae-1870-4dc9-9601-8290875a8560", 00:10:52.705 "assigned_rate_limits": { 00:10:52.705 "rw_ios_per_sec": 0, 00:10:52.705 "rw_mbytes_per_sec": 0, 00:10:52.705 "r_mbytes_per_sec": 0, 00:10:52.705 "w_mbytes_per_sec": 0 00:10:52.705 }, 00:10:52.705 "claimed": false, 00:10:52.705 "zoned": false, 00:10:52.705 "supported_io_types": { 00:10:52.705 "read": true, 00:10:52.705 "write": true, 00:10:52.705 "unmap": true, 00:10:52.705 "write_zeroes": true, 00:10:52.705 "flush": true, 00:10:52.705 "reset": true, 00:10:52.705 "compare": false, 00:10:52.705 "compare_and_write": false, 00:10:52.705 "abort": false, 00:10:52.705 "nvme_admin": false, 00:10:52.705 "nvme_io": false 00:10:52.705 }, 00:10:52.705 "memory_domains": [ 00:10:52.705 { 00:10:52.705 "dma_device_id": "system", 00:10:52.705 "dma_device_type": 1 00:10:52.705 }, 00:10:52.705 { 00:10:52.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:52.705 "dma_device_type": 2 00:10:52.705 }, 00:10:52.705 { 00:10:52.705 "dma_device_id": "system", 
00:10:52.705 "dma_device_type": 1 00:10:52.705 }, 00:10:52.705 { 00:10:52.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:52.705 "dma_device_type": 2 00:10:52.705 } 00:10:52.705 ], 00:10:52.705 "driver_specific": { 00:10:52.705 "raid": { 00:10:52.705 "uuid": "7f8177ae-1870-4dc9-9601-8290875a8560", 00:10:52.705 "strip_size_kb": 64, 00:10:52.705 "state": "online", 00:10:52.705 "raid_level": "concat", 00:10:52.705 "superblock": true, 00:10:52.705 "num_base_bdevs": 2, 00:10:52.705 "num_base_bdevs_discovered": 2, 00:10:52.705 "num_base_bdevs_operational": 2, 00:10:52.705 "base_bdevs_list": [ 00:10:52.705 { 00:10:52.705 "name": "pt1", 00:10:52.705 "uuid": "958e668c-c50b-58e0-80eb-66cb5186cfec", 00:10:52.705 "is_configured": true, 00:10:52.705 "data_offset": 2048, 00:10:52.705 "data_size": 63488 00:10:52.705 }, 00:10:52.705 { 00:10:52.705 "name": "pt2", 00:10:52.705 "uuid": "44bab209-5d0b-552e-ae4f-0d7d3b52d558", 00:10:52.705 "is_configured": true, 00:10:52.705 "data_offset": 2048, 00:10:52.705 "data_size": 63488 00:10:52.705 } 00:10:52.705 ] 00:10:52.705 } 00:10:52.705 } 00:10:52.705 }' 00:10:52.705 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:52.705 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:10:52.705 pt2' 00:10:52.705 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:52.705 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:52.706 11:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:52.965 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:52.965 "name": "pt1", 00:10:52.965 "aliases": [ 00:10:52.965 "958e668c-c50b-58e0-80eb-66cb5186cfec" 
00:10:52.965 ], 00:10:52.965 "product_name": "passthru", 00:10:52.965 "block_size": 512, 00:10:52.965 "num_blocks": 65536, 00:10:52.965 "uuid": "958e668c-c50b-58e0-80eb-66cb5186cfec", 00:10:52.965 "assigned_rate_limits": { 00:10:52.965 "rw_ios_per_sec": 0, 00:10:52.965 "rw_mbytes_per_sec": 0, 00:10:52.965 "r_mbytes_per_sec": 0, 00:10:52.965 "w_mbytes_per_sec": 0 00:10:52.965 }, 00:10:52.965 "claimed": true, 00:10:52.965 "claim_type": "exclusive_write", 00:10:52.965 "zoned": false, 00:10:52.965 "supported_io_types": { 00:10:52.965 "read": true, 00:10:52.965 "write": true, 00:10:52.965 "unmap": true, 00:10:52.965 "write_zeroes": true, 00:10:52.965 "flush": true, 00:10:52.965 "reset": true, 00:10:52.965 "compare": false, 00:10:52.965 "compare_and_write": false, 00:10:52.965 "abort": true, 00:10:52.965 "nvme_admin": false, 00:10:52.965 "nvme_io": false 00:10:52.965 }, 00:10:52.965 "memory_domains": [ 00:10:52.965 { 00:10:52.965 "dma_device_id": "system", 00:10:52.965 "dma_device_type": 1 00:10:52.965 }, 00:10:52.965 { 00:10:52.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:52.965 "dma_device_type": 2 00:10:52.965 } 00:10:52.965 ], 00:10:52.965 "driver_specific": { 00:10:52.965 "passthru": { 00:10:52.965 "name": "pt1", 00:10:52.965 "base_bdev_name": "malloc1" 00:10:52.965 } 00:10:52.965 } 00:10:52.965 }' 00:10:52.965 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:53.223 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:53.482 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:53.482 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:53.482 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:53.482 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:53.482 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:53.482 "name": "pt2", 00:10:53.482 "aliases": [ 00:10:53.482 "44bab209-5d0b-552e-ae4f-0d7d3b52d558" 00:10:53.482 ], 00:10:53.482 "product_name": "passthru", 00:10:53.482 "block_size": 512, 00:10:53.482 "num_blocks": 65536, 00:10:53.482 "uuid": "44bab209-5d0b-552e-ae4f-0d7d3b52d558", 00:10:53.482 "assigned_rate_limits": { 00:10:53.482 "rw_ios_per_sec": 0, 00:10:53.482 "rw_mbytes_per_sec": 0, 00:10:53.482 "r_mbytes_per_sec": 0, 00:10:53.482 "w_mbytes_per_sec": 0 00:10:53.482 }, 00:10:53.482 "claimed": true, 00:10:53.482 "claim_type": "exclusive_write", 00:10:53.482 "zoned": false, 00:10:53.482 "supported_io_types": { 00:10:53.482 "read": true, 00:10:53.482 "write": true, 00:10:53.482 "unmap": true, 00:10:53.482 "write_zeroes": true, 00:10:53.482 "flush": true, 00:10:53.482 "reset": true, 00:10:53.482 "compare": false, 00:10:53.482 "compare_and_write": false, 00:10:53.482 "abort": true, 00:10:53.482 "nvme_admin": false, 00:10:53.482 "nvme_io": false 00:10:53.482 }, 
00:10:53.482 "memory_domains": [ 00:10:53.482 { 00:10:53.482 "dma_device_id": "system", 00:10:53.482 "dma_device_type": 1 00:10:53.482 }, 00:10:53.482 { 00:10:53.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.482 "dma_device_type": 2 00:10:53.482 } 00:10:53.482 ], 00:10:53.482 "driver_specific": { 00:10:53.482 "passthru": { 00:10:53.482 "name": "pt2", 00:10:53.482 "base_bdev_name": "malloc2" 00:10:53.482 } 00:10:53.482 } 00:10:53.482 }' 00:10:53.482 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:53.742 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:54.001 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:54.001 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:54.001 11:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:10:54.001 [2024-05-14 11:47:21.078112] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:54.260 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=7f8177ae-1870-4dc9-9601-8290875a8560 00:10:54.260 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 7f8177ae-1870-4dc9-9601-8290875a8560 ']' 00:10:54.260 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:54.260 [2024-05-14 11:47:21.318512] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:54.260 [2024-05-14 11:47:21.318534] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:54.260 [2024-05-14 11:47:21.318590] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:54.260 [2024-05-14 11:47:21.318634] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:54.260 [2024-05-14 11:47:21.318645] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9c12c0 name raid_bdev1, state offline 00:10:54.260 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.260 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:10:54.519 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:10:54.519 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:10:54.519 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:10:54.519 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:10:54.778 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:10:54.778 11:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:55.038 11:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:55.038 11:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:55.297 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:55.556 [2024-05-14 11:47:22.541699] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:55.556 [2024-05-14 11:47:22.543098] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:55.556 [2024-05-14 11:47:22.543156] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:55.556 [2024-05-14 11:47:22.543196] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:55.556 [2024-05-14 11:47:22.543215] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:55.556 [2024-05-14 11:47:22.543226] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9c1000 name raid_bdev1, state configuring 00:10:55.556 request: 00:10:55.556 { 00:10:55.556 "name": "raid_bdev1", 00:10:55.556 "raid_level": "concat", 00:10:55.556 "base_bdevs": [ 00:10:55.556 "malloc1", 00:10:55.556 "malloc2" 00:10:55.556 ], 00:10:55.556 "superblock": false, 00:10:55.556 "strip_size_kb": 64, 00:10:55.556 "method": "bdev_raid_create", 00:10:55.556 "req_id": 1 00:10:55.556 } 
00:10:55.556 Got JSON-RPC error response 00:10:55.556 response: 00:10:55.556 { 00:10:55.556 "code": -17, 00:10:55.556 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:55.556 } 00:10:55.556 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:55.556 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:55.556 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:55.556 11:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:55.556 11:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:55.556 11:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:10:55.814 11:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:10:55.814 11:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:10:55.814 11:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:56.073 [2024-05-14 11:47:23.030940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:56.073 [2024-05-14 11:47:23.030985] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:56.073 [2024-05-14 11:47:23.031011] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x818c60 00:10:56.073 [2024-05-14 11:47:23.031024] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:56.073 [2024-05-14 11:47:23.032603] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:56.073 [2024-05-14 11:47:23.032631] vbdev_passthru.c: 
705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:56.073 [2024-05-14 11:47:23.032694] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:10:56.073 [2024-05-14 11:47:23.032718] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:56.073 pt1 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.073 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:56.332 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:56.332 "name": "raid_bdev1", 00:10:56.332 "uuid": "7f8177ae-1870-4dc9-9601-8290875a8560", 00:10:56.332 "strip_size_kb": 
64, 00:10:56.332 "state": "configuring", 00:10:56.332 "raid_level": "concat", 00:10:56.332 "superblock": true, 00:10:56.332 "num_base_bdevs": 2, 00:10:56.332 "num_base_bdevs_discovered": 1, 00:10:56.332 "num_base_bdevs_operational": 2, 00:10:56.332 "base_bdevs_list": [ 00:10:56.332 { 00:10:56.332 "name": "pt1", 00:10:56.332 "uuid": "958e668c-c50b-58e0-80eb-66cb5186cfec", 00:10:56.332 "is_configured": true, 00:10:56.332 "data_offset": 2048, 00:10:56.332 "data_size": 63488 00:10:56.332 }, 00:10:56.332 { 00:10:56.332 "name": null, 00:10:56.332 "uuid": "44bab209-5d0b-552e-ae4f-0d7d3b52d558", 00:10:56.332 "is_configured": false, 00:10:56.332 "data_offset": 2048, 00:10:56.332 "data_size": 63488 00:10:56.332 } 00:10:56.332 ] 00:10:56.332 }' 00:10:56.332 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:56.332 11:47:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:56.899 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:10:56.899 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:10:56.899 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:10:56.899 11:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:57.158 [2024-05-14 11:47:24.053657] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:57.158 [2024-05-14 11:47:24.053705] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:57.158 [2024-05-14 11:47:24.053729] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8190d0 00:10:57.158 [2024-05-14 11:47:24.053742] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:57.158 [2024-05-14 
11:47:24.054079] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:57.158 [2024-05-14 11:47:24.054096] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:57.158 [2024-05-14 11:47:24.054156] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:10:57.158 [2024-05-14 11:47:24.054175] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:57.158 [2024-05-14 11:47:24.054268] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x812900 00:10:57.158 [2024-05-14 11:47:24.054278] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:57.158 [2024-05-14 11:47:24.054448] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x80ed50 00:10:57.158 [2024-05-14 11:47:24.054572] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x812900 00:10:57.158 [2024-05-14 11:47:24.054581] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x812900 00:10:57.158 [2024-05-14 11:47:24.054676] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:57.158 pt2 00:10:57.158 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:10:57.158 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:10:57.158 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:57.158 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:57.158 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:57.158 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:10:57.159 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=64 00:10:57.159 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:57.159 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:57.159 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:57.159 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:57.159 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:57.159 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.159 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:57.418 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:57.418 "name": "raid_bdev1", 00:10:57.418 "uuid": "7f8177ae-1870-4dc9-9601-8290875a8560", 00:10:57.418 "strip_size_kb": 64, 00:10:57.418 "state": "online", 00:10:57.418 "raid_level": "concat", 00:10:57.418 "superblock": true, 00:10:57.418 "num_base_bdevs": 2, 00:10:57.418 "num_base_bdevs_discovered": 2, 00:10:57.418 "num_base_bdevs_operational": 2, 00:10:57.418 "base_bdevs_list": [ 00:10:57.418 { 00:10:57.418 "name": "pt1", 00:10:57.418 "uuid": "958e668c-c50b-58e0-80eb-66cb5186cfec", 00:10:57.418 "is_configured": true, 00:10:57.418 "data_offset": 2048, 00:10:57.418 "data_size": 63488 00:10:57.418 }, 00:10:57.418 { 00:10:57.418 "name": "pt2", 00:10:57.418 "uuid": "44bab209-5d0b-552e-ae4f-0d7d3b52d558", 00:10:57.418 "is_configured": true, 00:10:57.418 "data_offset": 2048, 00:10:57.418 "data_size": 63488 00:10:57.418 } 00:10:57.418 ] 00:10:57.418 }' 00:10:57.418 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:57.418 11:47:24 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:10:57.986 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:10:57.986 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:10:57.986 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:57.986 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:57.986 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:57.986 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:57.986 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:57.986 11:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:58.245 [2024-05-14 11:47:25.144789] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:58.245 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:58.245 "name": "raid_bdev1", 00:10:58.245 "aliases": [ 00:10:58.245 "7f8177ae-1870-4dc9-9601-8290875a8560" 00:10:58.245 ], 00:10:58.245 "product_name": "Raid Volume", 00:10:58.245 "block_size": 512, 00:10:58.245 "num_blocks": 126976, 00:10:58.245 "uuid": "7f8177ae-1870-4dc9-9601-8290875a8560", 00:10:58.245 "assigned_rate_limits": { 00:10:58.245 "rw_ios_per_sec": 0, 00:10:58.245 "rw_mbytes_per_sec": 0, 00:10:58.245 "r_mbytes_per_sec": 0, 00:10:58.245 "w_mbytes_per_sec": 0 00:10:58.245 }, 00:10:58.245 "claimed": false, 00:10:58.245 "zoned": false, 00:10:58.245 "supported_io_types": { 00:10:58.245 "read": true, 00:10:58.245 "write": true, 00:10:58.245 "unmap": true, 00:10:58.245 "write_zeroes": true, 00:10:58.245 "flush": true, 00:10:58.245 "reset": true, 00:10:58.245 "compare": 
false, 00:10:58.245 "compare_and_write": false, 00:10:58.245 "abort": false, 00:10:58.245 "nvme_admin": false, 00:10:58.245 "nvme_io": false 00:10:58.245 }, 00:10:58.245 "memory_domains": [ 00:10:58.245 { 00:10:58.245 "dma_device_id": "system", 00:10:58.245 "dma_device_type": 1 00:10:58.245 }, 00:10:58.245 { 00:10:58.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:58.245 "dma_device_type": 2 00:10:58.245 }, 00:10:58.245 { 00:10:58.245 "dma_device_id": "system", 00:10:58.245 "dma_device_type": 1 00:10:58.245 }, 00:10:58.245 { 00:10:58.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:58.245 "dma_device_type": 2 00:10:58.245 } 00:10:58.245 ], 00:10:58.245 "driver_specific": { 00:10:58.245 "raid": { 00:10:58.245 "uuid": "7f8177ae-1870-4dc9-9601-8290875a8560", 00:10:58.245 "strip_size_kb": 64, 00:10:58.245 "state": "online", 00:10:58.245 "raid_level": "concat", 00:10:58.245 "superblock": true, 00:10:58.245 "num_base_bdevs": 2, 00:10:58.245 "num_base_bdevs_discovered": 2, 00:10:58.245 "num_base_bdevs_operational": 2, 00:10:58.245 "base_bdevs_list": [ 00:10:58.245 { 00:10:58.245 "name": "pt1", 00:10:58.245 "uuid": "958e668c-c50b-58e0-80eb-66cb5186cfec", 00:10:58.245 "is_configured": true, 00:10:58.245 "data_offset": 2048, 00:10:58.245 "data_size": 63488 00:10:58.245 }, 00:10:58.245 { 00:10:58.245 "name": "pt2", 00:10:58.245 "uuid": "44bab209-5d0b-552e-ae4f-0d7d3b52d558", 00:10:58.245 "is_configured": true, 00:10:58.245 "data_offset": 2048, 00:10:58.245 "data_size": 63488 00:10:58.245 } 00:10:58.245 ] 00:10:58.245 } 00:10:58.245 } 00:10:58.245 }' 00:10:58.246 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:58.246 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:10:58.246 pt2' 00:10:58.246 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:58.246 11:47:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:58.246 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:58.505 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:58.505 "name": "pt1", 00:10:58.505 "aliases": [ 00:10:58.505 "958e668c-c50b-58e0-80eb-66cb5186cfec" 00:10:58.505 ], 00:10:58.505 "product_name": "passthru", 00:10:58.505 "block_size": 512, 00:10:58.505 "num_blocks": 65536, 00:10:58.505 "uuid": "958e668c-c50b-58e0-80eb-66cb5186cfec", 00:10:58.505 "assigned_rate_limits": { 00:10:58.505 "rw_ios_per_sec": 0, 00:10:58.505 "rw_mbytes_per_sec": 0, 00:10:58.505 "r_mbytes_per_sec": 0, 00:10:58.505 "w_mbytes_per_sec": 0 00:10:58.505 }, 00:10:58.505 "claimed": true, 00:10:58.505 "claim_type": "exclusive_write", 00:10:58.505 "zoned": false, 00:10:58.505 "supported_io_types": { 00:10:58.505 "read": true, 00:10:58.505 "write": true, 00:10:58.505 "unmap": true, 00:10:58.505 "write_zeroes": true, 00:10:58.505 "flush": true, 00:10:58.505 "reset": true, 00:10:58.505 "compare": false, 00:10:58.505 "compare_and_write": false, 00:10:58.505 "abort": true, 00:10:58.505 "nvme_admin": false, 00:10:58.505 "nvme_io": false 00:10:58.505 }, 00:10:58.505 "memory_domains": [ 00:10:58.505 { 00:10:58.505 "dma_device_id": "system", 00:10:58.505 "dma_device_type": 1 00:10:58.505 }, 00:10:58.505 { 00:10:58.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:58.505 "dma_device_type": 2 00:10:58.505 } 00:10:58.505 ], 00:10:58.505 "driver_specific": { 00:10:58.505 "passthru": { 00:10:58.505 "name": "pt1", 00:10:58.505 "base_bdev_name": "malloc1" 00:10:58.505 } 00:10:58.505 } 00:10:58.505 }' 00:10:58.505 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:58.505 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 
00:10:58.505 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:58.505 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:58.764 11:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:59.023 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:59.023 "name": "pt2", 00:10:59.023 "aliases": [ 00:10:59.023 "44bab209-5d0b-552e-ae4f-0d7d3b52d558" 00:10:59.023 ], 00:10:59.023 "product_name": "passthru", 00:10:59.023 "block_size": 512, 00:10:59.023 "num_blocks": 65536, 00:10:59.023 "uuid": "44bab209-5d0b-552e-ae4f-0d7d3b52d558", 00:10:59.023 "assigned_rate_limits": { 00:10:59.023 "rw_ios_per_sec": 0, 00:10:59.023 "rw_mbytes_per_sec": 0, 00:10:59.023 "r_mbytes_per_sec": 0, 00:10:59.023 "w_mbytes_per_sec": 0 00:10:59.023 }, 00:10:59.023 
"claimed": true, 00:10:59.023 "claim_type": "exclusive_write", 00:10:59.023 "zoned": false, 00:10:59.023 "supported_io_types": { 00:10:59.023 "read": true, 00:10:59.023 "write": true, 00:10:59.023 "unmap": true, 00:10:59.023 "write_zeroes": true, 00:10:59.023 "flush": true, 00:10:59.023 "reset": true, 00:10:59.023 "compare": false, 00:10:59.024 "compare_and_write": false, 00:10:59.024 "abort": true, 00:10:59.024 "nvme_admin": false, 00:10:59.024 "nvme_io": false 00:10:59.024 }, 00:10:59.024 "memory_domains": [ 00:10:59.024 { 00:10:59.024 "dma_device_id": "system", 00:10:59.024 "dma_device_type": 1 00:10:59.024 }, 00:10:59.024 { 00:10:59.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:59.024 "dma_device_type": 2 00:10:59.024 } 00:10:59.024 ], 00:10:59.024 "driver_specific": { 00:10:59.024 "passthru": { 00:10:59.024 "name": "pt2", 00:10:59.024 "base_bdev_name": "malloc2" 00:10:59.024 } 00:10:59.024 } 00:10:59.024 }' 00:10:59.024 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:59.024 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:59.282 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:59.282 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:59.282 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:59.282 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:59.282 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:59.282 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:59.282 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:59.282 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:59.282 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 
-- # jq .dif_type 00:10:59.542 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:59.542 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:59.542 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:10:59.542 [2024-05-14 11:47:26.620696] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 7f8177ae-1870-4dc9-9601-8290875a8560 '!=' 7f8177ae-1870-4dc9-9601-8290875a8560 ']' 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy concat 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 1671539 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 1671539 ']' 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 1671539 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1671539 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing 
process with pid 1671539' 00:10:59.800 killing process with pid 1671539 00:10:59.800 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 1671539 00:10:59.800 [2024-05-14 11:47:26.696536] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:59.800 [2024-05-14 11:47:26.696597] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:59.800 [2024-05-14 11:47:26.696641] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:59.800 [2024-05-14 11:47:26.696653] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x812900 name raid_bdev1, state offline 00:10:59.801 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 1671539 00:10:59.801 [2024-05-14 11:47:26.714252] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:00.060 11:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:11:00.060 00:11:00.060 real 0m10.431s 00:11:00.060 user 0m18.560s 00:11:00.060 sys 0m1.945s 00:11:00.060 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:00.060 11:47:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:00.060 ************************************ 00:11:00.060 END TEST raid_superblock_test 00:11:00.060 ************************************ 00:11:00.060 11:47:26 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:11:00.060 11:47:26 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:11:00.060 11:47:26 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:11:00.060 11:47:26 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:00.060 11:47:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:00.060 ************************************ 00:11:00.060 START TEST 
raid_state_function_test 00:11:00.060 ************************************ 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 false 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=1673134 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1673134' 00:11:00.060 Process raid pid: 1673134 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 1673134 /var/tmp/spdk-raid.sock 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 1673134 ']' 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:00.060 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:00.060 11:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:00.060 [2024-05-14 11:47:27.098574] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:11:00.060 [2024-05-14 11:47:27.098642] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:00.319 [2024-05-14 11:47:27.231991] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:00.319 [2024-05-14 11:47:27.336847] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:00.319 [2024-05-14 11:47:27.395728] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:00.319 [2024-05-14 11:47:27.395760] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:00.888 11:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:00.888 11:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:11:00.888 11:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:01.148 [2024-05-14 11:47:28.121925] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:01.148 [2024-05-14 11:47:28.121974] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:01.148 [2024-05-14 11:47:28.121985] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:01.148 [2024-05-14 11:47:28.121997] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't 
exist now 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.148 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:01.407 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:01.407 "name": "Existed_Raid", 00:11:01.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.407 "strip_size_kb": 0, 00:11:01.407 "state": "configuring", 00:11:01.407 "raid_level": "raid1", 00:11:01.407 "superblock": false, 00:11:01.407 "num_base_bdevs": 2, 00:11:01.407 "num_base_bdevs_discovered": 0, 00:11:01.407 "num_base_bdevs_operational": 2, 00:11:01.407 "base_bdevs_list": 
[ 00:11:01.407 { 00:11:01.407 "name": "BaseBdev1", 00:11:01.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.407 "is_configured": false, 00:11:01.407 "data_offset": 0, 00:11:01.407 "data_size": 0 00:11:01.407 }, 00:11:01.407 { 00:11:01.407 "name": "BaseBdev2", 00:11:01.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.407 "is_configured": false, 00:11:01.407 "data_offset": 0, 00:11:01.407 "data_size": 0 00:11:01.407 } 00:11:01.407 ] 00:11:01.407 }' 00:11:01.407 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:01.407 11:47:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.975 11:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:02.233 [2024-05-14 11:47:29.152529] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:02.233 [2024-05-14 11:47:29.152562] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeef700 name Existed_Raid, state configuring 00:11:02.233 11:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:02.491 [2024-05-14 11:47:29.397183] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:02.491 [2024-05-14 11:47:29.397215] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:02.491 [2024-05-14 11:47:29.397225] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:02.491 [2024-05-14 11:47:29.397237] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:02.491 11:47:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:02.749 [2024-05-14 11:47:29.655791] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:02.749 BaseBdev1 00:11:02.749 11:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:11:02.749 11:47:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:02.749 11:47:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:02.749 11:47:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:02.749 11:47:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:02.749 11:47:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:02.749 11:47:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:03.008 11:47:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:03.267 [ 00:11:03.267 { 00:11:03.267 "name": "BaseBdev1", 00:11:03.267 "aliases": [ 00:11:03.267 "8267f9da-1cff-4ced-82d0-561910922779" 00:11:03.267 ], 00:11:03.267 "product_name": "Malloc disk", 00:11:03.267 "block_size": 512, 00:11:03.267 "num_blocks": 65536, 00:11:03.267 "uuid": "8267f9da-1cff-4ced-82d0-561910922779", 00:11:03.267 "assigned_rate_limits": { 00:11:03.267 "rw_ios_per_sec": 0, 00:11:03.267 "rw_mbytes_per_sec": 0, 00:11:03.267 "r_mbytes_per_sec": 0, 00:11:03.267 "w_mbytes_per_sec": 0 00:11:03.267 }, 00:11:03.267 "claimed": true, 00:11:03.267 "claim_type": 
"exclusive_write", 00:11:03.267 "zoned": false, 00:11:03.267 "supported_io_types": { 00:11:03.267 "read": true, 00:11:03.267 "write": true, 00:11:03.267 "unmap": true, 00:11:03.267 "write_zeroes": true, 00:11:03.267 "flush": true, 00:11:03.267 "reset": true, 00:11:03.267 "compare": false, 00:11:03.267 "compare_and_write": false, 00:11:03.267 "abort": true, 00:11:03.267 "nvme_admin": false, 00:11:03.267 "nvme_io": false 00:11:03.267 }, 00:11:03.267 "memory_domains": [ 00:11:03.267 { 00:11:03.267 "dma_device_id": "system", 00:11:03.267 "dma_device_type": 1 00:11:03.267 }, 00:11:03.267 { 00:11:03.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:03.267 "dma_device_type": 2 00:11:03.267 } 00:11:03.267 ], 00:11:03.267 "driver_specific": {} 00:11:03.267 } 00:11:03.267 ] 00:11:03.267 11:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@125 -- # local tmp 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.268 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:03.527 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:03.527 "name": "Existed_Raid", 00:11:03.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.527 "strip_size_kb": 0, 00:11:03.527 "state": "configuring", 00:11:03.527 "raid_level": "raid1", 00:11:03.527 "superblock": false, 00:11:03.527 "num_base_bdevs": 2, 00:11:03.527 "num_base_bdevs_discovered": 1, 00:11:03.527 "num_base_bdevs_operational": 2, 00:11:03.527 "base_bdevs_list": [ 00:11:03.527 { 00:11:03.527 "name": "BaseBdev1", 00:11:03.527 "uuid": "8267f9da-1cff-4ced-82d0-561910922779", 00:11:03.527 "is_configured": true, 00:11:03.527 "data_offset": 0, 00:11:03.527 "data_size": 65536 00:11:03.527 }, 00:11:03.527 { 00:11:03.527 "name": "BaseBdev2", 00:11:03.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.527 "is_configured": false, 00:11:03.527 "data_offset": 0, 00:11:03.527 "data_size": 0 00:11:03.527 } 00:11:03.527 ] 00:11:03.527 }' 00:11:03.527 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:03.527 11:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.157 11:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:04.417 [2024-05-14 11:47:31.204009] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:04.417 [2024-05-14 11:47:31.204054] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeef9a0 name 
Existed_Raid, state configuring 00:11:04.417 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:04.417 [2024-05-14 11:47:31.448694] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:04.417 [2024-05-14 11:47:31.450260] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:04.417 [2024-05-14 11:47:31.450296] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:04.417 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:11:04.417 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@125 -- # local tmp 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.418 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:04.676 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:04.676 "name": "Existed_Raid", 00:11:04.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:04.676 "strip_size_kb": 0, 00:11:04.676 "state": "configuring", 00:11:04.676 "raid_level": "raid1", 00:11:04.676 "superblock": false, 00:11:04.676 "num_base_bdevs": 2, 00:11:04.676 "num_base_bdevs_discovered": 1, 00:11:04.676 "num_base_bdevs_operational": 2, 00:11:04.676 "base_bdevs_list": [ 00:11:04.676 { 00:11:04.676 "name": "BaseBdev1", 00:11:04.676 "uuid": "8267f9da-1cff-4ced-82d0-561910922779", 00:11:04.676 "is_configured": true, 00:11:04.676 "data_offset": 0, 00:11:04.676 "data_size": 65536 00:11:04.676 }, 00:11:04.676 { 00:11:04.676 "name": "BaseBdev2", 00:11:04.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:04.676 "is_configured": false, 00:11:04.676 "data_offset": 0, 00:11:04.676 "data_size": 0 00:11:04.676 } 00:11:04.676 ] 00:11:04.676 }' 00:11:04.676 11:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:04.676 11:47:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:05.244 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:05.503 [2024-05-14 11:47:32.454857] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:05.503 [2024-05-14 11:47:32.454898] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device 
register 0xeeeff0 00:11:05.503 [2024-05-14 11:47:32.454907] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:11:05.503 [2024-05-14 11:47:32.455103] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xef1810 00:11:05.503 [2024-05-14 11:47:32.455230] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xeeeff0 00:11:05.503 [2024-05-14 11:47:32.455240] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xeeeff0 00:11:05.503 [2024-05-14 11:47:32.455429] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:05.503 BaseBdev2 00:11:05.503 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:11:05.503 11:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:05.503 11:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:05.503 11:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:05.503 11:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:05.503 11:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:05.503 11:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:05.763 [ 00:11:05.763 { 00:11:05.763 "name": "BaseBdev2", 00:11:05.763 "aliases": [ 00:11:05.763 "b53b7a1a-7ff4-4009-bde4-fbe3bb8b17ab" 00:11:05.763 ], 00:11:05.763 "product_name": "Malloc disk", 00:11:05.763 "block_size": 
512, 00:11:05.763 "num_blocks": 65536, 00:11:05.763 "uuid": "b53b7a1a-7ff4-4009-bde4-fbe3bb8b17ab", 00:11:05.763 "assigned_rate_limits": { 00:11:05.763 "rw_ios_per_sec": 0, 00:11:05.763 "rw_mbytes_per_sec": 0, 00:11:05.763 "r_mbytes_per_sec": 0, 00:11:05.763 "w_mbytes_per_sec": 0 00:11:05.763 }, 00:11:05.763 "claimed": true, 00:11:05.763 "claim_type": "exclusive_write", 00:11:05.763 "zoned": false, 00:11:05.763 "supported_io_types": { 00:11:05.763 "read": true, 00:11:05.763 "write": true, 00:11:05.763 "unmap": true, 00:11:05.763 "write_zeroes": true, 00:11:05.763 "flush": true, 00:11:05.763 "reset": true, 00:11:05.763 "compare": false, 00:11:05.763 "compare_and_write": false, 00:11:05.763 "abort": true, 00:11:05.763 "nvme_admin": false, 00:11:05.763 "nvme_io": false 00:11:05.763 }, 00:11:05.763 "memory_domains": [ 00:11:05.763 { 00:11:05.763 "dma_device_id": "system", 00:11:05.763 "dma_device_type": 1 00:11:05.763 }, 00:11:05.763 { 00:11:05.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.763 "dma_device_type": 2 00:11:05.763 } 00:11:05.763 ], 00:11:05.763 "driver_specific": {} 00:11:05.763 } 00:11:05.763 ] 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:05.763 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:06.022 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:06.022 11:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:06.022 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:06.022 "name": "Existed_Raid", 00:11:06.022 "uuid": "489b8691-5709-4e2b-bb2c-2224e69a123f", 00:11:06.022 "strip_size_kb": 0, 00:11:06.022 "state": "online", 00:11:06.022 "raid_level": "raid1", 00:11:06.022 "superblock": false, 00:11:06.022 "num_base_bdevs": 2, 00:11:06.022 "num_base_bdevs_discovered": 2, 00:11:06.022 "num_base_bdevs_operational": 2, 00:11:06.022 "base_bdevs_list": [ 00:11:06.022 { 00:11:06.022 "name": "BaseBdev1", 00:11:06.022 "uuid": "8267f9da-1cff-4ced-82d0-561910922779", 00:11:06.022 "is_configured": true, 00:11:06.022 "data_offset": 0, 00:11:06.022 "data_size": 65536 00:11:06.022 }, 00:11:06.022 { 00:11:06.022 "name": "BaseBdev2", 00:11:06.022 "uuid": "b53b7a1a-7ff4-4009-bde4-fbe3bb8b17ab", 00:11:06.022 "is_configured": true, 00:11:06.022 "data_offset": 0, 00:11:06.022 "data_size": 65536 00:11:06.022 } 00:11:06.022 ] 00:11:06.022 }' 00:11:06.022 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # 
xtrace_disable 00:11:06.022 11:47:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:06.961 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:11:06.961 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:06.961 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:06.961 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:06.961 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:06.961 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:06.961 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:06.961 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:06.961 [2024-05-14 11:47:33.906966] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:06.961 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:06.961 "name": "Existed_Raid", 00:11:06.961 "aliases": [ 00:11:06.961 "489b8691-5709-4e2b-bb2c-2224e69a123f" 00:11:06.961 ], 00:11:06.961 "product_name": "Raid Volume", 00:11:06.961 "block_size": 512, 00:11:06.961 "num_blocks": 65536, 00:11:06.961 "uuid": "489b8691-5709-4e2b-bb2c-2224e69a123f", 00:11:06.961 "assigned_rate_limits": { 00:11:06.961 "rw_ios_per_sec": 0, 00:11:06.961 "rw_mbytes_per_sec": 0, 00:11:06.961 "r_mbytes_per_sec": 0, 00:11:06.961 "w_mbytes_per_sec": 0 00:11:06.961 }, 00:11:06.961 "claimed": false, 00:11:06.961 "zoned": false, 00:11:06.961 "supported_io_types": { 00:11:06.961 "read": true, 00:11:06.961 "write": true, 00:11:06.961 "unmap": false, 
00:11:06.961 "write_zeroes": true, 00:11:06.961 "flush": false, 00:11:06.961 "reset": true, 00:11:06.961 "compare": false, 00:11:06.961 "compare_and_write": false, 00:11:06.961 "abort": false, 00:11:06.961 "nvme_admin": false, 00:11:06.961 "nvme_io": false 00:11:06.961 }, 00:11:06.961 "memory_domains": [ 00:11:06.961 { 00:11:06.961 "dma_device_id": "system", 00:11:06.961 "dma_device_type": 1 00:11:06.961 }, 00:11:06.961 { 00:11:06.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.961 "dma_device_type": 2 00:11:06.961 }, 00:11:06.961 { 00:11:06.961 "dma_device_id": "system", 00:11:06.961 "dma_device_type": 1 00:11:06.961 }, 00:11:06.961 { 00:11:06.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.961 "dma_device_type": 2 00:11:06.961 } 00:11:06.961 ], 00:11:06.961 "driver_specific": { 00:11:06.961 "raid": { 00:11:06.961 "uuid": "489b8691-5709-4e2b-bb2c-2224e69a123f", 00:11:06.961 "strip_size_kb": 0, 00:11:06.961 "state": "online", 00:11:06.961 "raid_level": "raid1", 00:11:06.961 "superblock": false, 00:11:06.961 "num_base_bdevs": 2, 00:11:06.961 "num_base_bdevs_discovered": 2, 00:11:06.961 "num_base_bdevs_operational": 2, 00:11:06.961 "base_bdevs_list": [ 00:11:06.961 { 00:11:06.961 "name": "BaseBdev1", 00:11:06.961 "uuid": "8267f9da-1cff-4ced-82d0-561910922779", 00:11:06.961 "is_configured": true, 00:11:06.961 "data_offset": 0, 00:11:06.961 "data_size": 65536 00:11:06.961 }, 00:11:06.961 { 00:11:06.961 "name": "BaseBdev2", 00:11:06.961 "uuid": "b53b7a1a-7ff4-4009-bde4-fbe3bb8b17ab", 00:11:06.961 "is_configured": true, 00:11:06.961 "data_offset": 0, 00:11:06.961 "data_size": 65536 00:11:06.961 } 00:11:06.961 ] 00:11:06.961 } 00:11:06.961 } 00:11:06.961 }' 00:11:06.962 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:06.962 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:11:06.962 
BaseBdev2' 00:11:06.962 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:06.962 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:06.962 11:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:07.220 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:07.220 "name": "BaseBdev1", 00:11:07.220 "aliases": [ 00:11:07.220 "8267f9da-1cff-4ced-82d0-561910922779" 00:11:07.220 ], 00:11:07.220 "product_name": "Malloc disk", 00:11:07.220 "block_size": 512, 00:11:07.220 "num_blocks": 65536, 00:11:07.220 "uuid": "8267f9da-1cff-4ced-82d0-561910922779", 00:11:07.220 "assigned_rate_limits": { 00:11:07.220 "rw_ios_per_sec": 0, 00:11:07.220 "rw_mbytes_per_sec": 0, 00:11:07.220 "r_mbytes_per_sec": 0, 00:11:07.220 "w_mbytes_per_sec": 0 00:11:07.220 }, 00:11:07.220 "claimed": true, 00:11:07.220 "claim_type": "exclusive_write", 00:11:07.220 "zoned": false, 00:11:07.220 "supported_io_types": { 00:11:07.220 "read": true, 00:11:07.221 "write": true, 00:11:07.221 "unmap": true, 00:11:07.221 "write_zeroes": true, 00:11:07.221 "flush": true, 00:11:07.221 "reset": true, 00:11:07.221 "compare": false, 00:11:07.221 "compare_and_write": false, 00:11:07.221 "abort": true, 00:11:07.221 "nvme_admin": false, 00:11:07.221 "nvme_io": false 00:11:07.221 }, 00:11:07.221 "memory_domains": [ 00:11:07.221 { 00:11:07.221 "dma_device_id": "system", 00:11:07.221 "dma_device_type": 1 00:11:07.221 }, 00:11:07.221 { 00:11:07.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.221 "dma_device_type": 2 00:11:07.221 } 00:11:07.221 ], 00:11:07.221 "driver_specific": {} 00:11:07.221 }' 00:11:07.221 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:07.221 11:47:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:07.221 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:07.221 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:07.480 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:07.739 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:07.739 "name": "BaseBdev2", 00:11:07.739 "aliases": [ 00:11:07.739 "b53b7a1a-7ff4-4009-bde4-fbe3bb8b17ab" 00:11:07.739 ], 00:11:07.739 "product_name": "Malloc disk", 00:11:07.739 "block_size": 512, 00:11:07.739 "num_blocks": 65536, 00:11:07.739 "uuid": "b53b7a1a-7ff4-4009-bde4-fbe3bb8b17ab", 00:11:07.739 "assigned_rate_limits": { 00:11:07.739 
"rw_ios_per_sec": 0, 00:11:07.739 "rw_mbytes_per_sec": 0, 00:11:07.739 "r_mbytes_per_sec": 0, 00:11:07.739 "w_mbytes_per_sec": 0 00:11:07.739 }, 00:11:07.739 "claimed": true, 00:11:07.739 "claim_type": "exclusive_write", 00:11:07.739 "zoned": false, 00:11:07.739 "supported_io_types": { 00:11:07.739 "read": true, 00:11:07.739 "write": true, 00:11:07.739 "unmap": true, 00:11:07.739 "write_zeroes": true, 00:11:07.739 "flush": true, 00:11:07.739 "reset": true, 00:11:07.739 "compare": false, 00:11:07.739 "compare_and_write": false, 00:11:07.739 "abort": true, 00:11:07.739 "nvme_admin": false, 00:11:07.739 "nvme_io": false 00:11:07.739 }, 00:11:07.739 "memory_domains": [ 00:11:07.739 { 00:11:07.739 "dma_device_id": "system", 00:11:07.739 "dma_device_type": 1 00:11:07.739 }, 00:11:07.739 { 00:11:07.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.739 "dma_device_type": 2 00:11:07.739 } 00:11:07.739 ], 00:11:07.739 "driver_specific": {} 00:11:07.739 }' 00:11:07.739 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:07.998 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:07.998 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:07.998 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:07.998 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:07.999 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:07.999 11:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:07.999 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:07.999 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:07.999 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 
00:11:08.258 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:08.258 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:08.258 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:08.517 [2024-05-14 11:47:35.366659] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 0 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:08.517 11:47:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:08.517 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.776 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:08.776 "name": "Existed_Raid", 00:11:08.776 "uuid": "489b8691-5709-4e2b-bb2c-2224e69a123f", 00:11:08.776 "strip_size_kb": 0, 00:11:08.776 "state": "online", 00:11:08.776 "raid_level": "raid1", 00:11:08.776 "superblock": false, 00:11:08.776 "num_base_bdevs": 2, 00:11:08.776 "num_base_bdevs_discovered": 1, 00:11:08.776 "num_base_bdevs_operational": 1, 00:11:08.776 "base_bdevs_list": [ 00:11:08.776 { 00:11:08.776 "name": null, 00:11:08.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:08.776 "is_configured": false, 00:11:08.776 "data_offset": 0, 00:11:08.776 "data_size": 65536 00:11:08.776 }, 00:11:08.776 { 00:11:08.776 "name": "BaseBdev2", 00:11:08.776 "uuid": "b53b7a1a-7ff4-4009-bde4-fbe3bb8b17ab", 00:11:08.776 "is_configured": true, 00:11:08.776 "data_offset": 0, 00:11:08.776 "data_size": 65536 00:11:08.776 } 00:11:08.776 ] 00:11:08.776 }' 00:11:08.776 11:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:08.776 11:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:09.345 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:11:09.345 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:09.345 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.345 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:09.604 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:09.604 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:09.604 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:09.863 [2024-05-14 11:47:36.699650] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:09.863 [2024-05-14 11:47:36.699732] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:09.863 [2024-05-14 11:47:36.712490] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:09.863 [2024-05-14 11:47:36.712562] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:09.863 [2024-05-14 11:47:36.712576] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeeeff0 name Existed_Raid, state offline 00:11:09.863 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:09.863 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:09.863 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.863 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:11:10.121 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:11:10.121 11:47:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:11:10.121 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:11:10.121 11:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 1673134 00:11:10.121 11:47:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 1673134 ']' 00:11:10.122 11:47:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 1673134 00:11:10.122 11:47:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:11:10.122 11:47:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:10.122 11:47:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1673134 00:11:10.122 11:47:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:10.122 11:47:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:10.122 11:47:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1673134' 00:11:10.122 killing process with pid 1673134 00:11:10.122 11:47:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 1673134 00:11:10.122 [2024-05-14 11:47:37.019579] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:10.122 11:47:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 1673134 00:11:10.122 [2024-05-14 11:47:37.020471] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:10.380 11:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:11:10.380 00:11:10.380 real 0m10.196s 00:11:10.380 user 0m18.082s 00:11:10.380 sys 0m1.933s 00:11:10.380 11:47:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # 
xtrace_disable 00:11:10.380 11:47:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:10.380 ************************************ 00:11:10.380 END TEST raid_state_function_test 00:11:10.380 ************************************ 00:11:10.380 11:47:37 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:11:10.380 11:47:37 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:11:10.380 11:47:37 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:10.380 11:47:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:10.380 ************************************ 00:11:10.380 START TEST raid_state_function_test_sb 00:11:10.380 ************************************ 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=1674761 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1674761' 00:11:10.381 Process raid pid: 1674761 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 1674761 
/var/tmp/spdk-raid.sock 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1674761 ']' 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:10.381 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:10.381 11:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:10.381 [2024-05-14 11:47:37.378980] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:11:10.381 [2024-05-14 11:47:37.379047] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:10.639 [2024-05-14 11:47:37.500394] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:10.639 [2024-05-14 11:47:37.601713] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:10.639 [2024-05-14 11:47:37.661459] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:10.639 [2024-05-14 11:47:37.661497] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.206 11:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:11.206 11:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:11:11.206 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:11.464 [2024-05-14 11:47:38.459593] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:11.464 [2024-05-14 11:47:38.459635] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:11.464 [2024-05-14 11:47:38.459647] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:11.464 [2024-05-14 11:47:38.459659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:11.464 11:47:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.464 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:11.723 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:11.723 "name": "Existed_Raid", 00:11:11.723 "uuid": "5ae16566-2ed2-4267-8773-48e513e9ba95", 00:11:11.723 "strip_size_kb": 0, 00:11:11.723 "state": "configuring", 00:11:11.723 "raid_level": "raid1", 00:11:11.723 "superblock": true, 00:11:11.723 "num_base_bdevs": 2, 00:11:11.723 "num_base_bdevs_discovered": 0, 00:11:11.723 "num_base_bdevs_operational": 2, 00:11:11.723 "base_bdevs_list": [ 00:11:11.723 { 00:11:11.723 "name": "BaseBdev1", 00:11:11.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:11.723 "is_configured": false, 00:11:11.723 "data_offset": 0, 00:11:11.723 "data_size": 0 00:11:11.723 }, 00:11:11.723 { 00:11:11.723 "name": 
"BaseBdev2", 00:11:11.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:11.723 "is_configured": false, 00:11:11.723 "data_offset": 0, 00:11:11.723 "data_size": 0 00:11:11.723 } 00:11:11.723 ] 00:11:11.723 }' 00:11:11.723 11:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:11.723 11:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:12.312 11:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:12.568 [2024-05-14 11:47:39.534278] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:12.568 [2024-05-14 11:47:39.534308] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xac7700 name Existed_Raid, state configuring 00:11:12.568 11:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:12.827 [2024-05-14 11:47:39.778938] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:12.827 [2024-05-14 11:47:39.778971] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:12.827 [2024-05-14 11:47:39.778981] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:12.827 [2024-05-14 11:47:39.778992] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:12.827 11:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:13.084 [2024-05-14 11:47:40.033455] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:11:13.084 BaseBdev1 00:11:13.084 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:11:13.084 11:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:13.084 11:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:13.084 11:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:13.084 11:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:13.084 11:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:13.084 11:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:13.370 11:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:13.627 [ 00:11:13.627 { 00:11:13.627 "name": "BaseBdev1", 00:11:13.627 "aliases": [ 00:11:13.627 "84ca1a69-85d8-4dd2-a797-dd33d181f2f2" 00:11:13.627 ], 00:11:13.627 "product_name": "Malloc disk", 00:11:13.627 "block_size": 512, 00:11:13.627 "num_blocks": 65536, 00:11:13.627 "uuid": "84ca1a69-85d8-4dd2-a797-dd33d181f2f2", 00:11:13.627 "assigned_rate_limits": { 00:11:13.627 "rw_ios_per_sec": 0, 00:11:13.627 "rw_mbytes_per_sec": 0, 00:11:13.627 "r_mbytes_per_sec": 0, 00:11:13.627 "w_mbytes_per_sec": 0 00:11:13.627 }, 00:11:13.627 "claimed": true, 00:11:13.627 "claim_type": "exclusive_write", 00:11:13.627 "zoned": false, 00:11:13.627 "supported_io_types": { 00:11:13.627 "read": true, 00:11:13.627 "write": true, 00:11:13.627 "unmap": true, 00:11:13.627 "write_zeroes": true, 00:11:13.627 "flush": true, 00:11:13.627 
"reset": true, 00:11:13.627 "compare": false, 00:11:13.627 "compare_and_write": false, 00:11:13.627 "abort": true, 00:11:13.627 "nvme_admin": false, 00:11:13.627 "nvme_io": false 00:11:13.627 }, 00:11:13.627 "memory_domains": [ 00:11:13.627 { 00:11:13.627 "dma_device_id": "system", 00:11:13.627 "dma_device_type": 1 00:11:13.627 }, 00:11:13.627 { 00:11:13.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.627 "dma_device_type": 2 00:11:13.627 } 00:11:13.627 ], 00:11:13.627 "driver_specific": {} 00:11:13.627 } 00:11:13.627 ] 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.627 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:13.885 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:13.885 "name": "Existed_Raid", 00:11:13.885 "uuid": "1ad6160b-033c-4dc4-b9cf-d9fc82a6b24f", 00:11:13.885 "strip_size_kb": 0, 00:11:13.885 "state": "configuring", 00:11:13.885 "raid_level": "raid1", 00:11:13.885 "superblock": true, 00:11:13.885 "num_base_bdevs": 2, 00:11:13.885 "num_base_bdevs_discovered": 1, 00:11:13.885 "num_base_bdevs_operational": 2, 00:11:13.885 "base_bdevs_list": [ 00:11:13.885 { 00:11:13.885 "name": "BaseBdev1", 00:11:13.885 "uuid": "84ca1a69-85d8-4dd2-a797-dd33d181f2f2", 00:11:13.885 "is_configured": true, 00:11:13.885 "data_offset": 2048, 00:11:13.885 "data_size": 63488 00:11:13.885 }, 00:11:13.885 { 00:11:13.885 "name": "BaseBdev2", 00:11:13.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:13.885 "is_configured": false, 00:11:13.885 "data_offset": 0, 00:11:13.885 "data_size": 0 00:11:13.885 } 00:11:13.885 ] 00:11:13.885 }' 00:11:13.885 11:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:13.885 11:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:14.451 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:14.709 [2024-05-14 11:47:41.593570] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:14.709 [2024-05-14 11:47:41.593608] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xac79a0 name Existed_Raid, state configuring 00:11:14.709 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:14.967 [2024-05-14 11:47:41.838249] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:14.967 [2024-05-14 11:47:41.839734] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:14.967 [2024-05-14 11:47:41.839767] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:14.967 11:47:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.967 11:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:15.226 11:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:15.226 "name": "Existed_Raid", 00:11:15.226 "uuid": "faae989e-a3c8-46ce-afce-60691b7fcd4f", 00:11:15.226 "strip_size_kb": 0, 00:11:15.226 "state": "configuring", 00:11:15.226 "raid_level": "raid1", 00:11:15.226 "superblock": true, 00:11:15.226 "num_base_bdevs": 2, 00:11:15.226 "num_base_bdevs_discovered": 1, 00:11:15.226 "num_base_bdevs_operational": 2, 00:11:15.226 "base_bdevs_list": [ 00:11:15.226 { 00:11:15.226 "name": "BaseBdev1", 00:11:15.226 "uuid": "84ca1a69-85d8-4dd2-a797-dd33d181f2f2", 00:11:15.226 "is_configured": true, 00:11:15.226 "data_offset": 2048, 00:11:15.226 "data_size": 63488 00:11:15.226 }, 00:11:15.226 { 00:11:15.226 "name": "BaseBdev2", 00:11:15.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.226 "is_configured": false, 00:11:15.226 "data_offset": 0, 00:11:15.226 "data_size": 0 00:11:15.226 } 00:11:15.226 ] 00:11:15.226 }' 00:11:15.226 11:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:15.226 11:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:15.793 11:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:15.793 [2024-05-14 11:47:42.844296] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:15.793 [2024-05-14 11:47:42.844457] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xac6ff0 00:11:15.793 
[2024-05-14 11:47:42.844472] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:15.793 [2024-05-14 11:47:42.844643] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xac9810 00:11:15.793 [2024-05-14 11:47:42.844771] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xac6ff0 00:11:15.793 [2024-05-14 11:47:42.844781] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xac6ff0 00:11:15.793 [2024-05-14 11:47:42.844873] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:15.793 BaseBdev2 00:11:15.793 11:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:11:15.793 11:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:15.793 11:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:15.793 11:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:15.793 11:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:15.793 11:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:15.793 11:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:16.051 11:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:16.310 [ 00:11:16.310 { 00:11:16.310 "name": "BaseBdev2", 00:11:16.310 "aliases": [ 00:11:16.310 "6bdf9a0e-d0e0-469f-9778-098e1d84b806" 00:11:16.310 ], 00:11:16.310 "product_name": "Malloc disk", 00:11:16.310 "block_size": 512, 
00:11:16.310 "num_blocks": 65536, 00:11:16.310 "uuid": "6bdf9a0e-d0e0-469f-9778-098e1d84b806", 00:11:16.310 "assigned_rate_limits": { 00:11:16.310 "rw_ios_per_sec": 0, 00:11:16.310 "rw_mbytes_per_sec": 0, 00:11:16.310 "r_mbytes_per_sec": 0, 00:11:16.310 "w_mbytes_per_sec": 0 00:11:16.310 }, 00:11:16.310 "claimed": true, 00:11:16.310 "claim_type": "exclusive_write", 00:11:16.310 "zoned": false, 00:11:16.310 "supported_io_types": { 00:11:16.310 "read": true, 00:11:16.310 "write": true, 00:11:16.310 "unmap": true, 00:11:16.310 "write_zeroes": true, 00:11:16.310 "flush": true, 00:11:16.310 "reset": true, 00:11:16.310 "compare": false, 00:11:16.310 "compare_and_write": false, 00:11:16.310 "abort": true, 00:11:16.310 "nvme_admin": false, 00:11:16.310 "nvme_io": false 00:11:16.310 }, 00:11:16.310 "memory_domains": [ 00:11:16.310 { 00:11:16.310 "dma_device_id": "system", 00:11:16.310 "dma_device_type": 1 00:11:16.310 }, 00:11:16.310 { 00:11:16.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:16.310 "dma_device_type": 2 00:11:16.310 } 00:11:16.310 ], 00:11:16.310 "driver_specific": {} 00:11:16.310 } 00:11:16.310 ] 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:16.310 11:47:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.310 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:16.568 11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:16.568 "name": "Existed_Raid", 00:11:16.568 "uuid": "faae989e-a3c8-46ce-afce-60691b7fcd4f", 00:11:16.568 "strip_size_kb": 0, 00:11:16.568 "state": "online", 00:11:16.568 "raid_level": "raid1", 00:11:16.568 "superblock": true, 00:11:16.568 "num_base_bdevs": 2, 00:11:16.568 "num_base_bdevs_discovered": 2, 00:11:16.568 "num_base_bdevs_operational": 2, 00:11:16.568 "base_bdevs_list": [ 00:11:16.568 { 00:11:16.568 "name": "BaseBdev1", 00:11:16.568 "uuid": "84ca1a69-85d8-4dd2-a797-dd33d181f2f2", 00:11:16.568 "is_configured": true, 00:11:16.568 "data_offset": 2048, 00:11:16.568 "data_size": 63488 00:11:16.568 }, 00:11:16.568 { 00:11:16.568 "name": "BaseBdev2", 00:11:16.568 "uuid": "6bdf9a0e-d0e0-469f-9778-098e1d84b806", 00:11:16.568 "is_configured": true, 00:11:16.569 "data_offset": 2048, 00:11:16.569 "data_size": 63488 00:11:16.569 } 00:11:16.569 ] 00:11:16.569 }' 00:11:16.569 
11:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:16.569 11:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:17.137 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:11:17.137 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:17.137 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:17.137 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:17.137 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:17.137 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:11:17.137 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:17.137 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:17.396 [2024-05-14 11:47:44.420729] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:17.396 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:17.396 "name": "Existed_Raid", 00:11:17.396 "aliases": [ 00:11:17.396 "faae989e-a3c8-46ce-afce-60691b7fcd4f" 00:11:17.396 ], 00:11:17.396 "product_name": "Raid Volume", 00:11:17.396 "block_size": 512, 00:11:17.396 "num_blocks": 63488, 00:11:17.396 "uuid": "faae989e-a3c8-46ce-afce-60691b7fcd4f", 00:11:17.396 "assigned_rate_limits": { 00:11:17.396 "rw_ios_per_sec": 0, 00:11:17.396 "rw_mbytes_per_sec": 0, 00:11:17.396 "r_mbytes_per_sec": 0, 00:11:17.396 "w_mbytes_per_sec": 0 00:11:17.396 }, 00:11:17.396 "claimed": false, 00:11:17.396 "zoned": false, 00:11:17.396 
"supported_io_types": { 00:11:17.396 "read": true, 00:11:17.396 "write": true, 00:11:17.396 "unmap": false, 00:11:17.396 "write_zeroes": true, 00:11:17.396 "flush": false, 00:11:17.396 "reset": true, 00:11:17.396 "compare": false, 00:11:17.396 "compare_and_write": false, 00:11:17.396 "abort": false, 00:11:17.396 "nvme_admin": false, 00:11:17.396 "nvme_io": false 00:11:17.396 }, 00:11:17.396 "memory_domains": [ 00:11:17.396 { 00:11:17.396 "dma_device_id": "system", 00:11:17.396 "dma_device_type": 1 00:11:17.396 }, 00:11:17.396 { 00:11:17.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.396 "dma_device_type": 2 00:11:17.396 }, 00:11:17.396 { 00:11:17.396 "dma_device_id": "system", 00:11:17.396 "dma_device_type": 1 00:11:17.396 }, 00:11:17.396 { 00:11:17.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.396 "dma_device_type": 2 00:11:17.396 } 00:11:17.396 ], 00:11:17.396 "driver_specific": { 00:11:17.396 "raid": { 00:11:17.396 "uuid": "faae989e-a3c8-46ce-afce-60691b7fcd4f", 00:11:17.396 "strip_size_kb": 0, 00:11:17.396 "state": "online", 00:11:17.396 "raid_level": "raid1", 00:11:17.396 "superblock": true, 00:11:17.396 "num_base_bdevs": 2, 00:11:17.396 "num_base_bdevs_discovered": 2, 00:11:17.396 "num_base_bdevs_operational": 2, 00:11:17.396 "base_bdevs_list": [ 00:11:17.396 { 00:11:17.396 "name": "BaseBdev1", 00:11:17.396 "uuid": "84ca1a69-85d8-4dd2-a797-dd33d181f2f2", 00:11:17.396 "is_configured": true, 00:11:17.396 "data_offset": 2048, 00:11:17.396 "data_size": 63488 00:11:17.396 }, 00:11:17.396 { 00:11:17.396 "name": "BaseBdev2", 00:11:17.396 "uuid": "6bdf9a0e-d0e0-469f-9778-098e1d84b806", 00:11:17.396 "is_configured": true, 00:11:17.396 "data_offset": 2048, 00:11:17.396 "data_size": 63488 00:11:17.396 } 00:11:17.396 ] 00:11:17.396 } 00:11:17.396 } 00:11:17.396 }' 00:11:17.396 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:17.396 
11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:11:17.396 BaseBdev2' 00:11:17.396 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:17.396 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:17.396 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:17.656 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:17.656 "name": "BaseBdev1", 00:11:17.656 "aliases": [ 00:11:17.656 "84ca1a69-85d8-4dd2-a797-dd33d181f2f2" 00:11:17.656 ], 00:11:17.656 "product_name": "Malloc disk", 00:11:17.656 "block_size": 512, 00:11:17.656 "num_blocks": 65536, 00:11:17.656 "uuid": "84ca1a69-85d8-4dd2-a797-dd33d181f2f2", 00:11:17.656 "assigned_rate_limits": { 00:11:17.656 "rw_ios_per_sec": 0, 00:11:17.656 "rw_mbytes_per_sec": 0, 00:11:17.656 "r_mbytes_per_sec": 0, 00:11:17.656 "w_mbytes_per_sec": 0 00:11:17.656 }, 00:11:17.656 "claimed": true, 00:11:17.656 "claim_type": "exclusive_write", 00:11:17.656 "zoned": false, 00:11:17.656 "supported_io_types": { 00:11:17.656 "read": true, 00:11:17.656 "write": true, 00:11:17.656 "unmap": true, 00:11:17.656 "write_zeroes": true, 00:11:17.656 "flush": true, 00:11:17.656 "reset": true, 00:11:17.656 "compare": false, 00:11:17.656 "compare_and_write": false, 00:11:17.656 "abort": true, 00:11:17.656 "nvme_admin": false, 00:11:17.656 "nvme_io": false 00:11:17.656 }, 00:11:17.656 "memory_domains": [ 00:11:17.656 { 00:11:17.656 "dma_device_id": "system", 00:11:17.656 "dma_device_type": 1 00:11:17.656 }, 00:11:17.656 { 00:11:17.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.656 "dma_device_type": 2 00:11:17.656 } 00:11:17.656 ], 00:11:17.656 "driver_specific": {} 00:11:17.656 }' 00:11:17.656 11:47:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:17.915 11:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:18.173 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:18.173 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:18.173 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:18.173 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:18.173 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:18.173 "name": "BaseBdev2", 00:11:18.173 "aliases": [ 00:11:18.173 "6bdf9a0e-d0e0-469f-9778-098e1d84b806" 00:11:18.173 ], 00:11:18.173 "product_name": "Malloc disk", 00:11:18.173 "block_size": 512, 00:11:18.173 
"num_blocks": 65536, 00:11:18.173 "uuid": "6bdf9a0e-d0e0-469f-9778-098e1d84b806", 00:11:18.173 "assigned_rate_limits": { 00:11:18.173 "rw_ios_per_sec": 0, 00:11:18.173 "rw_mbytes_per_sec": 0, 00:11:18.173 "r_mbytes_per_sec": 0, 00:11:18.173 "w_mbytes_per_sec": 0 00:11:18.173 }, 00:11:18.173 "claimed": true, 00:11:18.173 "claim_type": "exclusive_write", 00:11:18.173 "zoned": false, 00:11:18.173 "supported_io_types": { 00:11:18.173 "read": true, 00:11:18.173 "write": true, 00:11:18.173 "unmap": true, 00:11:18.173 "write_zeroes": true, 00:11:18.173 "flush": true, 00:11:18.173 "reset": true, 00:11:18.173 "compare": false, 00:11:18.173 "compare_and_write": false, 00:11:18.173 "abort": true, 00:11:18.173 "nvme_admin": false, 00:11:18.173 "nvme_io": false 00:11:18.173 }, 00:11:18.173 "memory_domains": [ 00:11:18.173 { 00:11:18.173 "dma_device_id": "system", 00:11:18.173 "dma_device_type": 1 00:11:18.173 }, 00:11:18.173 { 00:11:18.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.173 "dma_device_type": 2 00:11:18.173 } 00:11:18.173 ], 00:11:18.173 "driver_specific": {} 00:11:18.173 }' 00:11:18.173 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:18.173 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:18.432 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:18.432 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:18.432 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:18.432 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:18.432 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:18.432 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:18.432 11:47:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:18.432 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:18.432 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:18.691 [2024-05-14 11:47:45.667850] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 0 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:11:18.691 
11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.691 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:18.949 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:18.949 "name": "Existed_Raid", 00:11:18.949 "uuid": "faae989e-a3c8-46ce-afce-60691b7fcd4f", 00:11:18.949 "strip_size_kb": 0, 00:11:18.949 "state": "online", 00:11:18.949 "raid_level": "raid1", 00:11:18.949 "superblock": true, 00:11:18.949 "num_base_bdevs": 2, 00:11:18.949 "num_base_bdevs_discovered": 1, 00:11:18.949 "num_base_bdevs_operational": 1, 00:11:18.949 "base_bdevs_list": [ 00:11:18.949 { 00:11:18.949 "name": null, 00:11:18.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.949 "is_configured": false, 00:11:18.949 "data_offset": 2048, 00:11:18.949 "data_size": 63488 00:11:18.949 }, 00:11:18.949 { 00:11:18.949 "name": "BaseBdev2", 00:11:18.949 "uuid": "6bdf9a0e-d0e0-469f-9778-098e1d84b806", 00:11:18.949 "is_configured": true, 00:11:18.949 "data_offset": 2048, 00:11:18.949 "data_size": 63488 00:11:18.949 } 00:11:18.949 ] 00:11:18.949 }' 00:11:18.949 11:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:18.949 11:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:19.515 11:47:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:11:19.515 11:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:19.515 11:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:19.515 11:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:19.772 11:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:19.772 11:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:19.772 11:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:20.029 [2024-05-14 11:47:46.985264] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:20.029 [2024-05-14 11:47:46.985337] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:20.029 [2024-05-14 11:47:46.996742] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:20.029 [2024-05-14 11:47:46.996808] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:20.029 [2024-05-14 11:47:46.996821] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xac6ff0 name Existed_Raid, state offline 00:11:20.029 11:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:20.029 11:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:20.029 11:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.029 11:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 1674761 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1674761 ']' 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 1674761 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1674761 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1674761' 00:11:20.286 killing process with pid 1674761 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 1674761 00:11:20.286 [2024-05-14 11:47:47.302650] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:20.286 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 1674761 00:11:20.286 [2024-05-14 11:47:47.303632] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:20.544 11:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:11:20.544 00:11:20.544 real 0m10.214s 00:11:20.544 user 0m18.160s 00:11:20.544 sys 0m1.891s 00:11:20.544 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:20.544 11:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:20.544 ************************************ 00:11:20.544 END TEST raid_state_function_test_sb 00:11:20.544 ************************************ 00:11:20.544 11:47:47 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:11:20.544 11:47:47 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:11:20.544 11:47:47 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:20.544 11:47:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:20.544 ************************************ 00:11:20.544 START TEST raid_superblock_test 00:11:20.544 ************************************ 00:11:20.544 11:47:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:11:20.544 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:11:20.544 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:11:20.545 
11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=1676390 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 1676390 /var/tmp/spdk-raid.sock 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 1676390 ']' 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:20.545 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:20.545 11:47:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.817 [2024-05-14 11:47:47.677188] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:11:20.817 [2024-05-14 11:47:47.677264] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1676390 ] 00:11:20.817 [2024-05-14 11:47:47.806527] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:21.085 [2024-05-14 11:47:47.904697] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:21.086 [2024-05-14 11:47:47.970656] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.086 [2024-05-14 11:47:47.970696] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # 
base_bdevs_pt+=($bdev_pt) 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:21.653 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:21.912 malloc1 00:11:21.912 11:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:22.170 [2024-05-14 11:47:49.084469] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:22.170 [2024-05-14 11:47:49.084517] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:22.170 [2024-05-14 11:47:49.084540] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19112a0 00:11:22.170 [2024-05-14 11:47:49.084553] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:22.170 [2024-05-14 11:47:49.086123] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:22.170 [2024-05-14 11:47:49.086150] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:22.170 pt1 00:11:22.170 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:11:22.170 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:22.170 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:11:22.170 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:11:22.170 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:22.170 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:11:22.170 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:11:22.170 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:22.170 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:22.426 malloc2 00:11:22.426 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:22.689 [2024-05-14 11:47:49.582579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:22.689 [2024-05-14 11:47:49.582624] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:22.689 [2024-05-14 11:47:49.582647] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ac4480 00:11:22.689 [2024-05-14 11:47:49.582660] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:22.689 [2024-05-14 11:47:49.584034] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:22.689 [2024-05-14 11:47:49.584060] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:22.689 pt2 00:11:22.689 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:11:22.689 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:22.689 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:22.953 [2024-05-14 11:47:49.827253] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:22.953 [2024-05-14 11:47:49.828445] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:22.953 [2024-05-14 11:47:49.828586] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1aba2c0 00:11:22.953 [2024-05-14 11:47:49.828600] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:22.953 [2024-05-14 11:47:49.828777] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1910f70 00:11:22.953 [2024-05-14 11:47:49.828921] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1aba2c0 00:11:22.953 [2024-05-14 11:47:49.828931] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1aba2c0 00:11:22.953 [2024-05-14 11:47:49.829021] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:22.953 11:47:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.953 11:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:23.212 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:23.212 "name": "raid_bdev1", 00:11:23.212 "uuid": "d07c3dea-c706-4c53-bb5f-9f44300ed280", 00:11:23.212 "strip_size_kb": 0, 00:11:23.212 "state": "online", 00:11:23.212 "raid_level": "raid1", 00:11:23.212 "superblock": true, 00:11:23.212 "num_base_bdevs": 2, 00:11:23.212 "num_base_bdevs_discovered": 2, 00:11:23.212 "num_base_bdevs_operational": 2, 00:11:23.212 "base_bdevs_list": [ 00:11:23.212 { 00:11:23.212 "name": "pt1", 00:11:23.212 "uuid": "4a9690ad-7355-555f-98bb-35549659946e", 00:11:23.212 "is_configured": true, 00:11:23.212 "data_offset": 2048, 00:11:23.212 "data_size": 63488 00:11:23.212 }, 00:11:23.212 { 00:11:23.212 "name": "pt2", 00:11:23.212 "uuid": "31f35a5c-2e93-5d06-8fe1-dc009c550906", 00:11:23.212 "is_configured": true, 00:11:23.212 "data_offset": 2048, 00:11:23.212 "data_size": 63488 00:11:23.212 } 00:11:23.212 ] 00:11:23.212 }' 00:11:23.212 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:23.212 11:47:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.774 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:11:23.774 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:11:23.774 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:23.774 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 
00:11:23.774 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:23.774 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:23.774 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:23.774 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:24.031 [2024-05-14 11:47:50.918366] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:24.031 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:24.031 "name": "raid_bdev1", 00:11:24.031 "aliases": [ 00:11:24.031 "d07c3dea-c706-4c53-bb5f-9f44300ed280" 00:11:24.031 ], 00:11:24.031 "product_name": "Raid Volume", 00:11:24.031 "block_size": 512, 00:11:24.031 "num_blocks": 63488, 00:11:24.031 "uuid": "d07c3dea-c706-4c53-bb5f-9f44300ed280", 00:11:24.031 "assigned_rate_limits": { 00:11:24.031 "rw_ios_per_sec": 0, 00:11:24.031 "rw_mbytes_per_sec": 0, 00:11:24.031 "r_mbytes_per_sec": 0, 00:11:24.031 "w_mbytes_per_sec": 0 00:11:24.031 }, 00:11:24.031 "claimed": false, 00:11:24.031 "zoned": false, 00:11:24.031 "supported_io_types": { 00:11:24.031 "read": true, 00:11:24.031 "write": true, 00:11:24.031 "unmap": false, 00:11:24.031 "write_zeroes": true, 00:11:24.031 "flush": false, 00:11:24.031 "reset": true, 00:11:24.031 "compare": false, 00:11:24.031 "compare_and_write": false, 00:11:24.031 "abort": false, 00:11:24.031 "nvme_admin": false, 00:11:24.031 "nvme_io": false 00:11:24.031 }, 00:11:24.031 "memory_domains": [ 00:11:24.031 { 00:11:24.031 "dma_device_id": "system", 00:11:24.031 "dma_device_type": 1 00:11:24.031 }, 00:11:24.031 { 00:11:24.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.031 "dma_device_type": 2 00:11:24.031 }, 00:11:24.031 { 00:11:24.031 "dma_device_id": "system", 00:11:24.031 
"dma_device_type": 1 00:11:24.031 }, 00:11:24.031 { 00:11:24.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.031 "dma_device_type": 2 00:11:24.031 } 00:11:24.031 ], 00:11:24.031 "driver_specific": { 00:11:24.031 "raid": { 00:11:24.031 "uuid": "d07c3dea-c706-4c53-bb5f-9f44300ed280", 00:11:24.031 "strip_size_kb": 0, 00:11:24.031 "state": "online", 00:11:24.031 "raid_level": "raid1", 00:11:24.031 "superblock": true, 00:11:24.031 "num_base_bdevs": 2, 00:11:24.031 "num_base_bdevs_discovered": 2, 00:11:24.031 "num_base_bdevs_operational": 2, 00:11:24.031 "base_bdevs_list": [ 00:11:24.031 { 00:11:24.031 "name": "pt1", 00:11:24.031 "uuid": "4a9690ad-7355-555f-98bb-35549659946e", 00:11:24.031 "is_configured": true, 00:11:24.031 "data_offset": 2048, 00:11:24.031 "data_size": 63488 00:11:24.031 }, 00:11:24.031 { 00:11:24.031 "name": "pt2", 00:11:24.031 "uuid": "31f35a5c-2e93-5d06-8fe1-dc009c550906", 00:11:24.031 "is_configured": true, 00:11:24.031 "data_offset": 2048, 00:11:24.031 "data_size": 63488 00:11:24.031 } 00:11:24.031 ] 00:11:24.031 } 00:11:24.031 } 00:11:24.031 }' 00:11:24.031 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:24.031 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:11:24.031 pt2' 00:11:24.031 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:24.031 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:24.031 11:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:24.289 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:24.289 "name": "pt1", 00:11:24.289 "aliases": [ 00:11:24.289 "4a9690ad-7355-555f-98bb-35549659946e" 00:11:24.289 ], 
00:11:24.289 "product_name": "passthru", 00:11:24.289 "block_size": 512, 00:11:24.289 "num_blocks": 65536, 00:11:24.289 "uuid": "4a9690ad-7355-555f-98bb-35549659946e", 00:11:24.289 "assigned_rate_limits": { 00:11:24.289 "rw_ios_per_sec": 0, 00:11:24.289 "rw_mbytes_per_sec": 0, 00:11:24.289 "r_mbytes_per_sec": 0, 00:11:24.289 "w_mbytes_per_sec": 0 00:11:24.289 }, 00:11:24.289 "claimed": true, 00:11:24.289 "claim_type": "exclusive_write", 00:11:24.289 "zoned": false, 00:11:24.289 "supported_io_types": { 00:11:24.289 "read": true, 00:11:24.289 "write": true, 00:11:24.289 "unmap": true, 00:11:24.289 "write_zeroes": true, 00:11:24.289 "flush": true, 00:11:24.289 "reset": true, 00:11:24.289 "compare": false, 00:11:24.289 "compare_and_write": false, 00:11:24.289 "abort": true, 00:11:24.289 "nvme_admin": false, 00:11:24.289 "nvme_io": false 00:11:24.289 }, 00:11:24.289 "memory_domains": [ 00:11:24.289 { 00:11:24.289 "dma_device_id": "system", 00:11:24.289 "dma_device_type": 1 00:11:24.289 }, 00:11:24.289 { 00:11:24.289 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.289 "dma_device_type": 2 00:11:24.289 } 00:11:24.289 ], 00:11:24.289 "driver_specific": { 00:11:24.289 "passthru": { 00:11:24.289 "name": "pt1", 00:11:24.289 "base_bdev_name": "malloc1" 00:11:24.289 } 00:11:24.289 } 00:11:24.289 }' 00:11:24.289 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:24.289 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:24.289 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:24.289 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:24.289 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 
00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:24.546 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:24.803 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:24.803 "name": "pt2", 00:11:24.803 "aliases": [ 00:11:24.803 "31f35a5c-2e93-5d06-8fe1-dc009c550906" 00:11:24.803 ], 00:11:24.803 "product_name": "passthru", 00:11:24.803 "block_size": 512, 00:11:24.803 "num_blocks": 65536, 00:11:24.803 "uuid": "31f35a5c-2e93-5d06-8fe1-dc009c550906", 00:11:24.803 "assigned_rate_limits": { 00:11:24.803 "rw_ios_per_sec": 0, 00:11:24.803 "rw_mbytes_per_sec": 0, 00:11:24.803 "r_mbytes_per_sec": 0, 00:11:24.803 "w_mbytes_per_sec": 0 00:11:24.803 }, 00:11:24.803 "claimed": true, 00:11:24.803 "claim_type": "exclusive_write", 00:11:24.803 "zoned": false, 00:11:24.803 "supported_io_types": { 00:11:24.803 "read": true, 00:11:24.803 "write": true, 00:11:24.803 "unmap": true, 00:11:24.803 "write_zeroes": true, 00:11:24.803 "flush": true, 00:11:24.803 "reset": true, 00:11:24.803 "compare": false, 00:11:24.803 "compare_and_write": false, 00:11:24.803 "abort": true, 00:11:24.803 "nvme_admin": false, 00:11:24.803 "nvme_io": false 00:11:24.803 }, 00:11:24.803 
"memory_domains": [ 00:11:24.803 { 00:11:24.803 "dma_device_id": "system", 00:11:24.803 "dma_device_type": 1 00:11:24.803 }, 00:11:24.803 { 00:11:24.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.803 "dma_device_type": 2 00:11:24.803 } 00:11:24.803 ], 00:11:24.803 "driver_specific": { 00:11:24.803 "passthru": { 00:11:24.803 "name": "pt2", 00:11:24.803 "base_bdev_name": "malloc2" 00:11:24.803 } 00:11:24.803 } 00:11:24.803 }' 00:11:24.803 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:24.803 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:25.061 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:25.061 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:25.061 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:25.061 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:25.061 11:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:25.061 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:25.061 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:25.061 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:25.061 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:25.318 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:25.318 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:25.318 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:11:25.575 [2024-05-14 11:47:52.410282] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:25.575 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=d07c3dea-c706-4c53-bb5f-9f44300ed280 00:11:25.575 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z d07c3dea-c706-4c53-bb5f-9f44300ed280 ']' 00:11:25.575 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:25.575 [2024-05-14 11:47:52.654706] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:25.575 [2024-05-14 11:47:52.654728] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:25.575 [2024-05-14 11:47:52.654784] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:25.575 [2024-05-14 11:47:52.654839] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:25.575 [2024-05-14 11:47:52.654850] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aba2c0 name raid_bdev1, state offline 00:11:25.833 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.833 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:11:26.091 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:11:26.091 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:11:26.091 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:11:26.091 11:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:11:26.091 11:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:11:26.091 11:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:26.349 11:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:26.349 11:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:26.606 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:26.607 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:26.864 [2024-05-14 11:47:53.865889] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:26.864 [2024-05-14 11:47:53.867246] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:26.864 [2024-05-14 11:47:53.867305] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:26.864 [2024-05-14 11:47:53.867345] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:26.865 [2024-05-14 11:47:53.867364] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:26.865 [2024-05-14 11:47:53.867374] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aba000 name raid_bdev1, state configuring 00:11:26.865 request: 00:11:26.865 { 00:11:26.865 "name": "raid_bdev1", 00:11:26.865 "raid_level": "raid1", 00:11:26.865 "base_bdevs": [ 00:11:26.865 "malloc1", 00:11:26.865 "malloc2" 00:11:26.865 ], 00:11:26.865 "superblock": false, 00:11:26.865 "method": "bdev_raid_create", 00:11:26.865 "req_id": 1 00:11:26.865 } 00:11:26.865 Got JSON-RPC error response 
00:11:26.865 response: 00:11:26.865 { 00:11:26.865 "code": -17, 00:11:26.865 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:26.865 } 00:11:26.865 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:26.865 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:26.865 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:26.865 11:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:26.865 11:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.865 11:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:11:27.122 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:11:27.122 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:11:27.122 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:27.379 [2024-05-14 11:47:54.351103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:27.379 [2024-05-14 11:47:54.351154] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:27.379 [2024-05-14 11:47:54.351181] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1912c40 00:11:27.379 [2024-05-14 11:47:54.351194] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:27.379 [2024-05-14 11:47:54.352880] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:27.379 [2024-05-14 11:47:54.352911] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: pt1 00:11:27.379 [2024-05-14 11:47:54.352983] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:11:27.379 [2024-05-14 11:47:54.353012] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:27.379 pt1 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:27.379 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:27.656 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:27.656 "name": "raid_bdev1", 00:11:27.656 "uuid": "d07c3dea-c706-4c53-bb5f-9f44300ed280", 00:11:27.656 "strip_size_kb": 0, 00:11:27.656 "state": "configuring", 
00:11:27.656 "raid_level": "raid1", 00:11:27.656 "superblock": true, 00:11:27.656 "num_base_bdevs": 2, 00:11:27.656 "num_base_bdevs_discovered": 1, 00:11:27.656 "num_base_bdevs_operational": 2, 00:11:27.656 "base_bdevs_list": [ 00:11:27.656 { 00:11:27.656 "name": "pt1", 00:11:27.656 "uuid": "4a9690ad-7355-555f-98bb-35549659946e", 00:11:27.656 "is_configured": true, 00:11:27.656 "data_offset": 2048, 00:11:27.656 "data_size": 63488 00:11:27.656 }, 00:11:27.656 { 00:11:27.656 "name": null, 00:11:27.656 "uuid": "31f35a5c-2e93-5d06-8fe1-dc009c550906", 00:11:27.656 "is_configured": false, 00:11:27.656 "data_offset": 2048, 00:11:27.656 "data_size": 63488 00:11:27.656 } 00:11:27.656 ] 00:11:27.656 }' 00:11:27.656 11:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:27.656 11:47:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:28.223 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:11:28.223 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:11:28.223 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:11:28.223 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:28.481 [2024-05-14 11:47:55.417941] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:28.481 [2024-05-14 11:47:55.417992] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:28.481 [2024-05-14 11:47:55.418015] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x190cf30 00:11:28.481 [2024-05-14 11:47:55.418028] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:28.481 [2024-05-14 11:47:55.418385] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:28.481 [2024-05-14 11:47:55.418413] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:28.481 [2024-05-14 11:47:55.418477] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:11:28.481 [2024-05-14 11:47:55.418498] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:28.481 [2024-05-14 11:47:55.418597] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1907bd0 00:11:28.481 [2024-05-14 11:47:55.418607] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:28.481 [2024-05-14 11:47:55.418776] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19121d0 00:11:28.481 [2024-05-14 11:47:55.418904] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1907bd0 00:11:28.481 [2024-05-14 11:47:55.418914] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1907bd0 00:11:28.481 [2024-05-14 11:47:55.419014] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:28.481 pt2 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:28.481 11:47:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.481 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:28.739 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:28.739 "name": "raid_bdev1", 00:11:28.739 "uuid": "d07c3dea-c706-4c53-bb5f-9f44300ed280", 00:11:28.739 "strip_size_kb": 0, 00:11:28.739 "state": "online", 00:11:28.739 "raid_level": "raid1", 00:11:28.739 "superblock": true, 00:11:28.739 "num_base_bdevs": 2, 00:11:28.739 "num_base_bdevs_discovered": 2, 00:11:28.739 "num_base_bdevs_operational": 2, 00:11:28.739 "base_bdevs_list": [ 00:11:28.739 { 00:11:28.739 "name": "pt1", 00:11:28.739 "uuid": "4a9690ad-7355-555f-98bb-35549659946e", 00:11:28.739 "is_configured": true, 00:11:28.739 "data_offset": 2048, 00:11:28.739 "data_size": 63488 00:11:28.739 }, 00:11:28.739 { 00:11:28.739 "name": "pt2", 00:11:28.739 "uuid": "31f35a5c-2e93-5d06-8fe1-dc009c550906", 00:11:28.739 "is_configured": true, 00:11:28.739 "data_offset": 2048, 00:11:28.739 "data_size": 63488 00:11:28.739 } 00:11:28.739 ] 00:11:28.739 }' 00:11:28.739 11:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:28.739 11:47:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
00:11:29.304 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:11:29.304 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:11:29.304 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:29.304 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:29.304 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:29.304 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:29.304 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:29.304 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:29.563 [2024-05-14 11:47:56.485010] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:29.563 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:29.563 "name": "raid_bdev1", 00:11:29.563 "aliases": [ 00:11:29.563 "d07c3dea-c706-4c53-bb5f-9f44300ed280" 00:11:29.563 ], 00:11:29.563 "product_name": "Raid Volume", 00:11:29.563 "block_size": 512, 00:11:29.563 "num_blocks": 63488, 00:11:29.563 "uuid": "d07c3dea-c706-4c53-bb5f-9f44300ed280", 00:11:29.563 "assigned_rate_limits": { 00:11:29.563 "rw_ios_per_sec": 0, 00:11:29.563 "rw_mbytes_per_sec": 0, 00:11:29.563 "r_mbytes_per_sec": 0, 00:11:29.563 "w_mbytes_per_sec": 0 00:11:29.563 }, 00:11:29.563 "claimed": false, 00:11:29.563 "zoned": false, 00:11:29.563 "supported_io_types": { 00:11:29.563 "read": true, 00:11:29.563 "write": true, 00:11:29.563 "unmap": false, 00:11:29.563 "write_zeroes": true, 00:11:29.563 "flush": false, 00:11:29.563 "reset": true, 00:11:29.563 "compare": false, 00:11:29.563 "compare_and_write": 
false, 00:11:29.563 "abort": false, 00:11:29.563 "nvme_admin": false, 00:11:29.563 "nvme_io": false 00:11:29.563 }, 00:11:29.563 "memory_domains": [ 00:11:29.563 { 00:11:29.563 "dma_device_id": "system", 00:11:29.563 "dma_device_type": 1 00:11:29.563 }, 00:11:29.563 { 00:11:29.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:29.563 "dma_device_type": 2 00:11:29.563 }, 00:11:29.563 { 00:11:29.563 "dma_device_id": "system", 00:11:29.563 "dma_device_type": 1 00:11:29.563 }, 00:11:29.563 { 00:11:29.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:29.563 "dma_device_type": 2 00:11:29.563 } 00:11:29.563 ], 00:11:29.563 "driver_specific": { 00:11:29.563 "raid": { 00:11:29.563 "uuid": "d07c3dea-c706-4c53-bb5f-9f44300ed280", 00:11:29.563 "strip_size_kb": 0, 00:11:29.563 "state": "online", 00:11:29.563 "raid_level": "raid1", 00:11:29.563 "superblock": true, 00:11:29.563 "num_base_bdevs": 2, 00:11:29.563 "num_base_bdevs_discovered": 2, 00:11:29.563 "num_base_bdevs_operational": 2, 00:11:29.563 "base_bdevs_list": [ 00:11:29.563 { 00:11:29.563 "name": "pt1", 00:11:29.563 "uuid": "4a9690ad-7355-555f-98bb-35549659946e", 00:11:29.563 "is_configured": true, 00:11:29.563 "data_offset": 2048, 00:11:29.563 "data_size": 63488 00:11:29.563 }, 00:11:29.563 { 00:11:29.563 "name": "pt2", 00:11:29.563 "uuid": "31f35a5c-2e93-5d06-8fe1-dc009c550906", 00:11:29.563 "is_configured": true, 00:11:29.563 "data_offset": 2048, 00:11:29.563 "data_size": 63488 00:11:29.563 } 00:11:29.563 ] 00:11:29.563 } 00:11:29.563 } 00:11:29.563 }' 00:11:29.563 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:29.563 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:11:29.563 pt2' 00:11:29.563 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:29.563 11:47:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:29.563 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:29.822 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:29.822 "name": "pt1", 00:11:29.822 "aliases": [ 00:11:29.822 "4a9690ad-7355-555f-98bb-35549659946e" 00:11:29.822 ], 00:11:29.822 "product_name": "passthru", 00:11:29.822 "block_size": 512, 00:11:29.822 "num_blocks": 65536, 00:11:29.822 "uuid": "4a9690ad-7355-555f-98bb-35549659946e", 00:11:29.822 "assigned_rate_limits": { 00:11:29.822 "rw_ios_per_sec": 0, 00:11:29.822 "rw_mbytes_per_sec": 0, 00:11:29.822 "r_mbytes_per_sec": 0, 00:11:29.822 "w_mbytes_per_sec": 0 00:11:29.822 }, 00:11:29.822 "claimed": true, 00:11:29.822 "claim_type": "exclusive_write", 00:11:29.822 "zoned": false, 00:11:29.822 "supported_io_types": { 00:11:29.822 "read": true, 00:11:29.822 "write": true, 00:11:29.822 "unmap": true, 00:11:29.822 "write_zeroes": true, 00:11:29.822 "flush": true, 00:11:29.822 "reset": true, 00:11:29.822 "compare": false, 00:11:29.822 "compare_and_write": false, 00:11:29.822 "abort": true, 00:11:29.822 "nvme_admin": false, 00:11:29.822 "nvme_io": false 00:11:29.822 }, 00:11:29.822 "memory_domains": [ 00:11:29.822 { 00:11:29.822 "dma_device_id": "system", 00:11:29.822 "dma_device_type": 1 00:11:29.822 }, 00:11:29.822 { 00:11:29.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:29.822 "dma_device_type": 2 00:11:29.822 } 00:11:29.822 ], 00:11:29.822 "driver_specific": { 00:11:29.822 "passthru": { 00:11:29.822 "name": "pt1", 00:11:29.822 "base_bdev_name": "malloc1" 00:11:29.822 } 00:11:29.822 } 00:11:29.822 }' 00:11:29.822 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:29.822 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:29.822 11:47:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:29.822 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:30.080 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:30.080 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:30.080 11:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:30.080 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:30.080 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:30.080 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:30.080 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:30.080 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:30.080 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:30.080 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:30.080 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:30.338 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:30.338 "name": "pt2", 00:11:30.338 "aliases": [ 00:11:30.338 "31f35a5c-2e93-5d06-8fe1-dc009c550906" 00:11:30.338 ], 00:11:30.338 "product_name": "passthru", 00:11:30.338 "block_size": 512, 00:11:30.338 "num_blocks": 65536, 00:11:30.338 "uuid": "31f35a5c-2e93-5d06-8fe1-dc009c550906", 00:11:30.338 "assigned_rate_limits": { 00:11:30.338 "rw_ios_per_sec": 0, 00:11:30.338 "rw_mbytes_per_sec": 0, 00:11:30.338 "r_mbytes_per_sec": 0, 00:11:30.338 "w_mbytes_per_sec": 0 00:11:30.338 }, 00:11:30.338 "claimed": true, 00:11:30.338 
"claim_type": "exclusive_write", 00:11:30.338 "zoned": false, 00:11:30.338 "supported_io_types": { 00:11:30.338 "read": true, 00:11:30.338 "write": true, 00:11:30.338 "unmap": true, 00:11:30.338 "write_zeroes": true, 00:11:30.338 "flush": true, 00:11:30.338 "reset": true, 00:11:30.338 "compare": false, 00:11:30.338 "compare_and_write": false, 00:11:30.338 "abort": true, 00:11:30.338 "nvme_admin": false, 00:11:30.338 "nvme_io": false 00:11:30.338 }, 00:11:30.338 "memory_domains": [ 00:11:30.338 { 00:11:30.338 "dma_device_id": "system", 00:11:30.338 "dma_device_type": 1 00:11:30.338 }, 00:11:30.338 { 00:11:30.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:30.338 "dma_device_type": 2 00:11:30.338 } 00:11:30.338 ], 00:11:30.338 "driver_specific": { 00:11:30.338 "passthru": { 00:11:30.338 "name": "pt2", 00:11:30.338 "base_bdev_name": "malloc2" 00:11:30.338 } 00:11:30.338 } 00:11:30.338 }' 00:11:30.339 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:30.597 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:30.597 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:30.597 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:30.597 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:30.597 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:30.597 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:30.597 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:30.597 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:30.597 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:30.856 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:30.856 
11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:30.856 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:30.856 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:11:31.115 [2024-05-14 11:47:57.969034] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:31.115 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' d07c3dea-c706-4c53-bb5f-9f44300ed280 '!=' d07c3dea-c706-4c53-bb5f-9f44300ed280 ']' 00:11:31.115 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:11:31.115 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:31.115 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 0 00:11:31.115 11:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:31.374 [2024-05-14 11:47:58.213497] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:11:31.374 11:47:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.374 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:31.633 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:31.633 "name": "raid_bdev1", 00:11:31.633 "uuid": "d07c3dea-c706-4c53-bb5f-9f44300ed280", 00:11:31.633 "strip_size_kb": 0, 00:11:31.633 "state": "online", 00:11:31.633 "raid_level": "raid1", 00:11:31.633 "superblock": true, 00:11:31.633 "num_base_bdevs": 2, 00:11:31.633 "num_base_bdevs_discovered": 1, 00:11:31.633 "num_base_bdevs_operational": 1, 00:11:31.633 "base_bdevs_list": [ 00:11:31.633 { 00:11:31.633 "name": null, 00:11:31.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:31.633 "is_configured": false, 00:11:31.633 "data_offset": 2048, 00:11:31.633 "data_size": 63488 00:11:31.633 }, 00:11:31.633 { 00:11:31.633 "name": "pt2", 00:11:31.633 "uuid": "31f35a5c-2e93-5d06-8fe1-dc009c550906", 00:11:31.633 "is_configured": true, 00:11:31.633 "data_offset": 2048, 00:11:31.633 "data_size": 63488 00:11:31.633 } 00:11:31.633 ] 00:11:31.633 }' 00:11:31.633 11:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:31.633 11:47:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.199 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:32.457 [2024-05-14 11:47:59.288305] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:32.457 [2024-05-14 11:47:59.288334] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:32.457 [2024-05-14 11:47:59.288393] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:32.457 [2024-05-14 11:47:59.288453] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:32.457 [2024-05-14 11:47:59.288465] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1907bd0 name raid_bdev1, state offline 00:11:32.457 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.457 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:11:32.715 11:47:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # i=1 00:11:32.715 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:32.974 [2024-05-14 11:47:59.978092] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:32.974 [2024-05-14 11:47:59.978137] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:32.974 [2024-05-14 11:47:59.978159] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x190d160 00:11:32.974 [2024-05-14 11:47:59.978172] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:32.974 [2024-05-14 11:47:59.979787] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:32.974 [2024-05-14 11:47:59.979814] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:32.974 [2024-05-14 11:47:59.979881] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:11:32.974 [2024-05-14 11:47:59.979908] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:32.974 [2024-05-14 11:47:59.979992] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x190b900 00:11:32.974 [2024-05-14 11:47:59.980003] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:32.974 [2024-05-14 11:47:59.980172] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1912250 00:11:32.974 [2024-05-14 11:47:59.980292] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x190b900 00:11:32.974 [2024-05-14 11:47:59.980303] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is 
created with name raid_bdev1, raid_bdev 0x190b900 00:11:32.974 [2024-05-14 11:47:59.980408] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:32.974 pt2 00:11:32.974 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:32.974 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:32.974 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:32.974 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:11:32.974 11:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:11:32.974 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:11:32.974 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:32.974 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:32.974 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:32.974 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:32.974 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.974 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:33.233 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:33.233 "name": "raid_bdev1", 00:11:33.233 "uuid": "d07c3dea-c706-4c53-bb5f-9f44300ed280", 00:11:33.233 "strip_size_kb": 0, 00:11:33.233 "state": "online", 00:11:33.233 "raid_level": "raid1", 00:11:33.233 "superblock": true, 00:11:33.233 "num_base_bdevs": 2, 00:11:33.233 
"num_base_bdevs_discovered": 1, 00:11:33.233 "num_base_bdevs_operational": 1, 00:11:33.233 "base_bdevs_list": [ 00:11:33.233 { 00:11:33.233 "name": null, 00:11:33.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.233 "is_configured": false, 00:11:33.233 "data_offset": 2048, 00:11:33.233 "data_size": 63488 00:11:33.233 }, 00:11:33.233 { 00:11:33.233 "name": "pt2", 00:11:33.233 "uuid": "31f35a5c-2e93-5d06-8fe1-dc009c550906", 00:11:33.233 "is_configured": true, 00:11:33.233 "data_offset": 2048, 00:11:33.233 "data_size": 63488 00:11:33.233 } 00:11:33.233 ] 00:11:33.233 }' 00:11:33.233 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:33.233 11:48:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.801 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # '[' 2 -gt 2 ']' 00:11:33.801 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:33.801 11:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:11:34.059 [2024-05-14 11:48:01.061170] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # '[' d07c3dea-c706-4c53-bb5f-9f44300ed280 '!=' d07c3dea-c706-4c53-bb5f-9f44300ed280 ']' 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 1676390 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 1676390 ']' 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 1676390 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = 
Linux ']' 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1676390 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1676390' 00:11:34.059 killing process with pid 1676390 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 1676390 00:11:34.059 [2024-05-14 11:48:01.130010] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:34.059 [2024-05-14 11:48:01.130077] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:34.059 [2024-05-14 11:48:01.130130] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:34.059 [2024-05-14 11:48:01.130144] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x190b900 name raid_bdev1, state offline 00:11:34.059 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 1676390 00:11:34.318 [2024-05-14 11:48:01.147632] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:34.318 11:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:11:34.318 00:11:34.318 real 0m13.749s 00:11:34.318 user 0m24.780s 00:11:34.318 sys 0m2.601s 00:11:34.318 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:34.318 11:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.318 ************************************ 00:11:34.318 END TEST raid_superblock_test 00:11:34.318 ************************************ 00:11:34.577 11:48:01 bdev_raid -- bdev/bdev_raid.sh@813 -- # for n in {2..4} 00:11:34.577 
11:48:01 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:11:34.577 11:48:01 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:11:34.577 11:48:01 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:11:34.577 11:48:01 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:34.577 11:48:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:34.577 ************************************ 00:11:34.577 START TEST raid_state_function_test 00:11:34.577 ************************************ 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 3 false 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= 
num_base_bdevs )) 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=1678548 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1678548' 00:11:34.577 Process raid pid: 1678548 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc 
-r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 1678548 /var/tmp/spdk-raid.sock 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 1678548 ']' 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:34.577 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:34.577 11:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.577 [2024-05-14 11:48:01.522508] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:11:34.577 [2024-05-14 11:48:01.522571] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:34.578 [2024-05-14 11:48:01.655630] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:34.836 [2024-05-14 11:48:01.762176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:34.836 [2024-05-14 11:48:01.828112] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:34.836 [2024-05-14 11:48:01.828150] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:35.414 11:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:35.414 11:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:11:35.414 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:35.678 [2024-05-14 11:48:02.671588] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:35.678 [2024-05-14 11:48:02.671631] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:35.678 [2024-05-14 11:48:02.671642] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:35.678 [2024-05-14 11:48:02.671654] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:35.678 [2024-05-14 11:48:02.671663] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:35.678 [2024-05-14 11:48:02.671673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:35.678 11:48:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.678 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:35.936 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:35.936 "name": "Existed_Raid", 00:11:35.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:35.936 "strip_size_kb": 64, 00:11:35.936 "state": "configuring", 00:11:35.936 "raid_level": "raid0", 00:11:35.936 "superblock": false, 00:11:35.936 "num_base_bdevs": 3, 00:11:35.936 "num_base_bdevs_discovered": 0, 00:11:35.936 "num_base_bdevs_operational": 3, 00:11:35.936 "base_bdevs_list": [ 00:11:35.936 { 
00:11:35.936 "name": "BaseBdev1", 00:11:35.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:35.936 "is_configured": false, 00:11:35.936 "data_offset": 0, 00:11:35.936 "data_size": 0 00:11:35.936 }, 00:11:35.936 { 00:11:35.936 "name": "BaseBdev2", 00:11:35.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:35.936 "is_configured": false, 00:11:35.936 "data_offset": 0, 00:11:35.936 "data_size": 0 00:11:35.936 }, 00:11:35.936 { 00:11:35.936 "name": "BaseBdev3", 00:11:35.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:35.936 "is_configured": false, 00:11:35.936 "data_offset": 0, 00:11:35.936 "data_size": 0 00:11:35.936 } 00:11:35.936 ] 00:11:35.936 }' 00:11:35.936 11:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:35.936 11:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.504 11:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:36.792 [2024-05-14 11:48:03.750336] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:36.792 [2024-05-14 11:48:03.750367] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x258a700 name Existed_Raid, state configuring 00:11:36.792 11:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:37.060 [2024-05-14 11:48:03.986977] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:37.060 [2024-05-14 11:48:03.987006] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:37.060 [2024-05-14 11:48:03.987016] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:11:37.060 [2024-05-14 11:48:03.987027] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:37.060 [2024-05-14 11:48:03.987036] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:37.060 [2024-05-14 11:48:03.987047] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:37.060 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:37.317 [2024-05-14 11:48:04.245611] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:37.317 BaseBdev1 00:11:37.317 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:11:37.317 11:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:37.317 11:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:37.317 11:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:37.317 11:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:37.317 11:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:37.318 11:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:37.575 11:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:37.832 [ 00:11:37.832 { 00:11:37.832 "name": "BaseBdev1", 00:11:37.832 "aliases": [ 00:11:37.832 
"85d2cd89-3b43-408e-afa2-609b69ec2d9e" 00:11:37.832 ], 00:11:37.832 "product_name": "Malloc disk", 00:11:37.832 "block_size": 512, 00:11:37.832 "num_blocks": 65536, 00:11:37.832 "uuid": "85d2cd89-3b43-408e-afa2-609b69ec2d9e", 00:11:37.832 "assigned_rate_limits": { 00:11:37.832 "rw_ios_per_sec": 0, 00:11:37.832 "rw_mbytes_per_sec": 0, 00:11:37.832 "r_mbytes_per_sec": 0, 00:11:37.832 "w_mbytes_per_sec": 0 00:11:37.832 }, 00:11:37.832 "claimed": true, 00:11:37.832 "claim_type": "exclusive_write", 00:11:37.832 "zoned": false, 00:11:37.832 "supported_io_types": { 00:11:37.832 "read": true, 00:11:37.832 "write": true, 00:11:37.832 "unmap": true, 00:11:37.832 "write_zeroes": true, 00:11:37.832 "flush": true, 00:11:37.833 "reset": true, 00:11:37.833 "compare": false, 00:11:37.833 "compare_and_write": false, 00:11:37.833 "abort": true, 00:11:37.833 "nvme_admin": false, 00:11:37.833 "nvme_io": false 00:11:37.833 }, 00:11:37.833 "memory_domains": [ 00:11:37.833 { 00:11:37.833 "dma_device_id": "system", 00:11:37.833 "dma_device_type": 1 00:11:37.833 }, 00:11:37.833 { 00:11:37.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.833 "dma_device_type": 2 00:11:37.833 } 00:11:37.833 ], 00:11:37.833 "driver_specific": {} 00:11:37.833 } 00:11:37.833 ] 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:37.833 11:48:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.833 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:38.091 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:38.091 "name": "Existed_Raid", 00:11:38.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:38.091 "strip_size_kb": 64, 00:11:38.091 "state": "configuring", 00:11:38.091 "raid_level": "raid0", 00:11:38.091 "superblock": false, 00:11:38.091 "num_base_bdevs": 3, 00:11:38.091 "num_base_bdevs_discovered": 1, 00:11:38.091 "num_base_bdevs_operational": 3, 00:11:38.091 "base_bdevs_list": [ 00:11:38.091 { 00:11:38.091 "name": "BaseBdev1", 00:11:38.091 "uuid": "85d2cd89-3b43-408e-afa2-609b69ec2d9e", 00:11:38.091 "is_configured": true, 00:11:38.091 "data_offset": 0, 00:11:38.091 "data_size": 65536 00:11:38.091 }, 00:11:38.091 { 00:11:38.091 "name": "BaseBdev2", 00:11:38.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:38.091 "is_configured": false, 00:11:38.091 "data_offset": 0, 00:11:38.091 "data_size": 0 00:11:38.091 }, 00:11:38.091 { 00:11:38.091 "name": "BaseBdev3", 00:11:38.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:38.091 "is_configured": false, 00:11:38.091 "data_offset": 0, 
00:11:38.091 "data_size": 0 00:11:38.091 } 00:11:38.091 ] 00:11:38.091 }' 00:11:38.091 11:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:38.091 11:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.657 11:48:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:38.913 [2024-05-14 11:48:05.745584] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:38.913 [2024-05-14 11:48:05.745626] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2589ff0 name Existed_Raid, state configuring 00:11:38.913 11:48:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:38.913 [2024-05-14 11:48:05.990264] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:38.913 [2024-05-14 11:48:05.991737] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:38.913 [2024-05-14 11:48:05.991768] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:38.913 [2024-05-14 11:48:05.991778] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:38.913 [2024-05-14 11:48:05.991790] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 3 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.171 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:39.429 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:39.429 "name": "Existed_Raid", 00:11:39.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.429 "strip_size_kb": 64, 00:11:39.429 "state": "configuring", 00:11:39.429 "raid_level": "raid0", 00:11:39.429 "superblock": false, 00:11:39.429 "num_base_bdevs": 3, 00:11:39.429 "num_base_bdevs_discovered": 1, 00:11:39.429 "num_base_bdevs_operational": 3, 00:11:39.429 "base_bdevs_list": [ 00:11:39.429 { 00:11:39.429 "name": "BaseBdev1", 00:11:39.429 "uuid": "85d2cd89-3b43-408e-afa2-609b69ec2d9e", 00:11:39.429 
"is_configured": true, 00:11:39.429 "data_offset": 0, 00:11:39.429 "data_size": 65536 00:11:39.429 }, 00:11:39.429 { 00:11:39.429 "name": "BaseBdev2", 00:11:39.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.429 "is_configured": false, 00:11:39.429 "data_offset": 0, 00:11:39.429 "data_size": 0 00:11:39.429 }, 00:11:39.429 { 00:11:39.429 "name": "BaseBdev3", 00:11:39.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.429 "is_configured": false, 00:11:39.429 "data_offset": 0, 00:11:39.429 "data_size": 0 00:11:39.429 } 00:11:39.429 ] 00:11:39.429 }' 00:11:39.429 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:39.429 11:48:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.996 11:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:39.996 [2024-05-14 11:48:07.072989] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:39.996 BaseBdev2 00:11:40.254 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:11:40.254 11:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:40.254 11:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:40.254 11:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:40.254 11:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:40.254 11:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:40.254 11:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:11:40.254 11:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:40.512 [ 00:11:40.512 { 00:11:40.512 "name": "BaseBdev2", 00:11:40.512 "aliases": [ 00:11:40.512 "6d2f9e5c-345e-4a2b-89d9-696524e1eb72" 00:11:40.512 ], 00:11:40.512 "product_name": "Malloc disk", 00:11:40.512 "block_size": 512, 00:11:40.512 "num_blocks": 65536, 00:11:40.512 "uuid": "6d2f9e5c-345e-4a2b-89d9-696524e1eb72", 00:11:40.512 "assigned_rate_limits": { 00:11:40.512 "rw_ios_per_sec": 0, 00:11:40.512 "rw_mbytes_per_sec": 0, 00:11:40.512 "r_mbytes_per_sec": 0, 00:11:40.512 "w_mbytes_per_sec": 0 00:11:40.512 }, 00:11:40.512 "claimed": true, 00:11:40.512 "claim_type": "exclusive_write", 00:11:40.512 "zoned": false, 00:11:40.512 "supported_io_types": { 00:11:40.512 "read": true, 00:11:40.512 "write": true, 00:11:40.512 "unmap": true, 00:11:40.512 "write_zeroes": true, 00:11:40.512 "flush": true, 00:11:40.512 "reset": true, 00:11:40.512 "compare": false, 00:11:40.512 "compare_and_write": false, 00:11:40.512 "abort": true, 00:11:40.512 "nvme_admin": false, 00:11:40.512 "nvme_io": false 00:11:40.512 }, 00:11:40.512 "memory_domains": [ 00:11:40.512 { 00:11:40.512 "dma_device_id": "system", 00:11:40.512 "dma_device_type": 1 00:11:40.512 }, 00:11:40.512 { 00:11:40.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.512 "dma_device_type": 2 00:11:40.512 } 00:11:40.512 ], 00:11:40.512 "driver_specific": {} 00:11:40.512 } 00:11:40.512 ] 00:11:40.512 11:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:40.512 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:40.512 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 
-- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.513 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:40.771 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:40.771 "name": "Existed_Raid", 00:11:40.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:40.771 "strip_size_kb": 64, 00:11:40.771 "state": "configuring", 00:11:40.771 "raid_level": "raid0", 00:11:40.771 "superblock": false, 00:11:40.771 "num_base_bdevs": 3, 00:11:40.771 "num_base_bdevs_discovered": 2, 00:11:40.771 "num_base_bdevs_operational": 3, 00:11:40.771 "base_bdevs_list": [ 00:11:40.771 { 00:11:40.771 "name": "BaseBdev1", 00:11:40.771 "uuid": 
"85d2cd89-3b43-408e-afa2-609b69ec2d9e", 00:11:40.771 "is_configured": true, 00:11:40.771 "data_offset": 0, 00:11:40.771 "data_size": 65536 00:11:40.771 }, 00:11:40.771 { 00:11:40.771 "name": "BaseBdev2", 00:11:40.771 "uuid": "6d2f9e5c-345e-4a2b-89d9-696524e1eb72", 00:11:40.771 "is_configured": true, 00:11:40.771 "data_offset": 0, 00:11:40.771 "data_size": 65536 00:11:40.771 }, 00:11:40.771 { 00:11:40.771 "name": "BaseBdev3", 00:11:40.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:40.771 "is_configured": false, 00:11:40.771 "data_offset": 0, 00:11:40.771 "data_size": 0 00:11:40.771 } 00:11:40.771 ] 00:11:40.771 }' 00:11:40.771 11:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:40.771 11:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:41.337 11:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:41.596 [2024-05-14 11:48:08.573553] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:41.596 [2024-05-14 11:48:08.573594] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x258b080 00:11:41.596 [2024-05-14 11:48:08.573602] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:41.596 [2024-05-14 11:48:08.573795] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x258ad50 00:11:41.596 [2024-05-14 11:48:08.573928] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x258b080 00:11:41.596 [2024-05-14 11:48:08.573938] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x258b080 00:11:41.596 [2024-05-14 11:48:08.574101] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:41.596 BaseBdev3 00:11:41.596 11:48:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:11:41.596 11:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:11:41.596 11:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:41.596 11:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:41.596 11:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:41.596 11:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:41.596 11:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:41.854 11:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:42.114 [ 00:11:42.114 { 00:11:42.114 "name": "BaseBdev3", 00:11:42.114 "aliases": [ 00:11:42.114 "bcc3771d-e70b-4ce6-8365-70edcb60c340" 00:11:42.114 ], 00:11:42.114 "product_name": "Malloc disk", 00:11:42.114 "block_size": 512, 00:11:42.114 "num_blocks": 65536, 00:11:42.114 "uuid": "bcc3771d-e70b-4ce6-8365-70edcb60c340", 00:11:42.114 "assigned_rate_limits": { 00:11:42.114 "rw_ios_per_sec": 0, 00:11:42.114 "rw_mbytes_per_sec": 0, 00:11:42.114 "r_mbytes_per_sec": 0, 00:11:42.114 "w_mbytes_per_sec": 0 00:11:42.114 }, 00:11:42.114 "claimed": true, 00:11:42.114 "claim_type": "exclusive_write", 00:11:42.114 "zoned": false, 00:11:42.114 "supported_io_types": { 00:11:42.114 "read": true, 00:11:42.114 "write": true, 00:11:42.114 "unmap": true, 00:11:42.114 "write_zeroes": true, 00:11:42.114 "flush": true, 00:11:42.114 "reset": true, 00:11:42.114 "compare": false, 00:11:42.114 "compare_and_write": false, 
00:11:42.114 "abort": true, 00:11:42.114 "nvme_admin": false, 00:11:42.114 "nvme_io": false 00:11:42.114 }, 00:11:42.114 "memory_domains": [ 00:11:42.114 { 00:11:42.114 "dma_device_id": "system", 00:11:42.114 "dma_device_type": 1 00:11:42.114 }, 00:11:42.114 { 00:11:42.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:42.114 "dma_device_type": 2 00:11:42.114 } 00:11:42.114 ], 00:11:42.114 "driver_specific": {} 00:11:42.114 } 00:11:42.114 ] 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.114 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:42.372 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:42.372 "name": "Existed_Raid", 00:11:42.372 "uuid": "8d89d6a1-0bea-4901-9e91-d43d28c9e7b7", 00:11:42.372 "strip_size_kb": 64, 00:11:42.372 "state": "online", 00:11:42.372 "raid_level": "raid0", 00:11:42.372 "superblock": false, 00:11:42.372 "num_base_bdevs": 3, 00:11:42.372 "num_base_bdevs_discovered": 3, 00:11:42.373 "num_base_bdevs_operational": 3, 00:11:42.373 "base_bdevs_list": [ 00:11:42.373 { 00:11:42.373 "name": "BaseBdev1", 00:11:42.373 "uuid": "85d2cd89-3b43-408e-afa2-609b69ec2d9e", 00:11:42.373 "is_configured": true, 00:11:42.373 "data_offset": 0, 00:11:42.373 "data_size": 65536 00:11:42.373 }, 00:11:42.373 { 00:11:42.373 "name": "BaseBdev2", 00:11:42.373 "uuid": "6d2f9e5c-345e-4a2b-89d9-696524e1eb72", 00:11:42.373 "is_configured": true, 00:11:42.373 "data_offset": 0, 00:11:42.373 "data_size": 65536 00:11:42.373 }, 00:11:42.373 { 00:11:42.373 "name": "BaseBdev3", 00:11:42.373 "uuid": "bcc3771d-e70b-4ce6-8365-70edcb60c340", 00:11:42.373 "is_configured": true, 00:11:42.373 "data_offset": 0, 00:11:42.373 "data_size": 65536 00:11:42.373 } 00:11:42.373 ] 00:11:42.373 }' 00:11:42.373 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:42.373 11:48:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:42.941 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:11:42.941 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:42.941 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local 
raid_bdev_info 00:11:42.941 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:42.941 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:42.941 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:42.941 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:42.941 11:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:43.200 [2024-05-14 11:48:10.089859] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:43.200 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:43.200 "name": "Existed_Raid", 00:11:43.200 "aliases": [ 00:11:43.200 "8d89d6a1-0bea-4901-9e91-d43d28c9e7b7" 00:11:43.200 ], 00:11:43.200 "product_name": "Raid Volume", 00:11:43.200 "block_size": 512, 00:11:43.200 "num_blocks": 196608, 00:11:43.200 "uuid": "8d89d6a1-0bea-4901-9e91-d43d28c9e7b7", 00:11:43.200 "assigned_rate_limits": { 00:11:43.200 "rw_ios_per_sec": 0, 00:11:43.200 "rw_mbytes_per_sec": 0, 00:11:43.200 "r_mbytes_per_sec": 0, 00:11:43.200 "w_mbytes_per_sec": 0 00:11:43.200 }, 00:11:43.200 "claimed": false, 00:11:43.200 "zoned": false, 00:11:43.200 "supported_io_types": { 00:11:43.200 "read": true, 00:11:43.200 "write": true, 00:11:43.200 "unmap": true, 00:11:43.200 "write_zeroes": true, 00:11:43.200 "flush": true, 00:11:43.200 "reset": true, 00:11:43.200 "compare": false, 00:11:43.200 "compare_and_write": false, 00:11:43.200 "abort": false, 00:11:43.200 "nvme_admin": false, 00:11:43.200 "nvme_io": false 00:11:43.200 }, 00:11:43.200 "memory_domains": [ 00:11:43.200 { 00:11:43.200 "dma_device_id": "system", 00:11:43.200 "dma_device_type": 1 00:11:43.200 }, 00:11:43.200 { 00:11:43.200 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:43.200 "dma_device_type": 2 00:11:43.200 }, 00:11:43.200 { 00:11:43.200 "dma_device_id": "system", 00:11:43.200 "dma_device_type": 1 00:11:43.200 }, 00:11:43.200 { 00:11:43.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.200 "dma_device_type": 2 00:11:43.200 }, 00:11:43.200 { 00:11:43.200 "dma_device_id": "system", 00:11:43.200 "dma_device_type": 1 00:11:43.200 }, 00:11:43.200 { 00:11:43.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.200 "dma_device_type": 2 00:11:43.200 } 00:11:43.200 ], 00:11:43.200 "driver_specific": { 00:11:43.200 "raid": { 00:11:43.200 "uuid": "8d89d6a1-0bea-4901-9e91-d43d28c9e7b7", 00:11:43.200 "strip_size_kb": 64, 00:11:43.200 "state": "online", 00:11:43.200 "raid_level": "raid0", 00:11:43.200 "superblock": false, 00:11:43.200 "num_base_bdevs": 3, 00:11:43.200 "num_base_bdevs_discovered": 3, 00:11:43.200 "num_base_bdevs_operational": 3, 00:11:43.200 "base_bdevs_list": [ 00:11:43.200 { 00:11:43.200 "name": "BaseBdev1", 00:11:43.200 "uuid": "85d2cd89-3b43-408e-afa2-609b69ec2d9e", 00:11:43.200 "is_configured": true, 00:11:43.200 "data_offset": 0, 00:11:43.200 "data_size": 65536 00:11:43.200 }, 00:11:43.200 { 00:11:43.200 "name": "BaseBdev2", 00:11:43.200 "uuid": "6d2f9e5c-345e-4a2b-89d9-696524e1eb72", 00:11:43.200 "is_configured": true, 00:11:43.200 "data_offset": 0, 00:11:43.200 "data_size": 65536 00:11:43.200 }, 00:11:43.200 { 00:11:43.200 "name": "BaseBdev3", 00:11:43.200 "uuid": "bcc3771d-e70b-4ce6-8365-70edcb60c340", 00:11:43.200 "is_configured": true, 00:11:43.200 "data_offset": 0, 00:11:43.200 "data_size": 65536 00:11:43.200 } 00:11:43.200 ] 00:11:43.200 } 00:11:43.200 } 00:11:43.200 }' 00:11:43.200 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:43.200 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:11:43.200 BaseBdev2 
00:11:43.200 BaseBdev3' 00:11:43.200 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:43.200 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:43.200 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:43.459 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:43.459 "name": "BaseBdev1", 00:11:43.459 "aliases": [ 00:11:43.459 "85d2cd89-3b43-408e-afa2-609b69ec2d9e" 00:11:43.459 ], 00:11:43.459 "product_name": "Malloc disk", 00:11:43.459 "block_size": 512, 00:11:43.459 "num_blocks": 65536, 00:11:43.459 "uuid": "85d2cd89-3b43-408e-afa2-609b69ec2d9e", 00:11:43.459 "assigned_rate_limits": { 00:11:43.459 "rw_ios_per_sec": 0, 00:11:43.459 "rw_mbytes_per_sec": 0, 00:11:43.459 "r_mbytes_per_sec": 0, 00:11:43.459 "w_mbytes_per_sec": 0 00:11:43.459 }, 00:11:43.459 "claimed": true, 00:11:43.459 "claim_type": "exclusive_write", 00:11:43.459 "zoned": false, 00:11:43.459 "supported_io_types": { 00:11:43.459 "read": true, 00:11:43.459 "write": true, 00:11:43.459 "unmap": true, 00:11:43.459 "write_zeroes": true, 00:11:43.459 "flush": true, 00:11:43.459 "reset": true, 00:11:43.459 "compare": false, 00:11:43.459 "compare_and_write": false, 00:11:43.459 "abort": true, 00:11:43.459 "nvme_admin": false, 00:11:43.459 "nvme_io": false 00:11:43.459 }, 00:11:43.459 "memory_domains": [ 00:11:43.459 { 00:11:43.459 "dma_device_id": "system", 00:11:43.459 "dma_device_type": 1 00:11:43.459 }, 00:11:43.459 { 00:11:43.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.459 "dma_device_type": 2 00:11:43.459 } 00:11:43.459 ], 00:11:43.459 "driver_specific": {} 00:11:43.459 }' 00:11:43.459 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:43.459 11:48:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:43.459 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:43.459 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:43.459 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:43.718 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:43.976 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:43.976 "name": "BaseBdev2", 00:11:43.976 "aliases": [ 00:11:43.976 "6d2f9e5c-345e-4a2b-89d9-696524e1eb72" 00:11:43.976 ], 00:11:43.976 "product_name": "Malloc disk", 00:11:43.976 "block_size": 512, 00:11:43.976 "num_blocks": 65536, 00:11:43.976 "uuid": "6d2f9e5c-345e-4a2b-89d9-696524e1eb72", 00:11:43.976 "assigned_rate_limits": { 00:11:43.976 
"rw_ios_per_sec": 0, 00:11:43.976 "rw_mbytes_per_sec": 0, 00:11:43.976 "r_mbytes_per_sec": 0, 00:11:43.976 "w_mbytes_per_sec": 0 00:11:43.976 }, 00:11:43.976 "claimed": true, 00:11:43.976 "claim_type": "exclusive_write", 00:11:43.976 "zoned": false, 00:11:43.976 "supported_io_types": { 00:11:43.976 "read": true, 00:11:43.976 "write": true, 00:11:43.976 "unmap": true, 00:11:43.976 "write_zeroes": true, 00:11:43.976 "flush": true, 00:11:43.976 "reset": true, 00:11:43.976 "compare": false, 00:11:43.976 "compare_and_write": false, 00:11:43.976 "abort": true, 00:11:43.976 "nvme_admin": false, 00:11:43.976 "nvme_io": false 00:11:43.976 }, 00:11:43.976 "memory_domains": [ 00:11:43.976 { 00:11:43.976 "dma_device_id": "system", 00:11:43.976 "dma_device_type": 1 00:11:43.976 }, 00:11:43.976 { 00:11:43.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.976 "dma_device_type": 2 00:11:43.976 } 00:11:43.976 ], 00:11:43.976 "driver_specific": {} 00:11:43.976 }' 00:11:43.976 11:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:43.976 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:44.233 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:44.233 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:44.233 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:44.233 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:44.233 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:44.233 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:44.233 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:44.233 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 
00:11:44.233 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:44.492 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:44.492 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:44.492 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:44.492 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:44.492 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:44.492 "name": "BaseBdev3", 00:11:44.492 "aliases": [ 00:11:44.492 "bcc3771d-e70b-4ce6-8365-70edcb60c340" 00:11:44.492 ], 00:11:44.492 "product_name": "Malloc disk", 00:11:44.492 "block_size": 512, 00:11:44.492 "num_blocks": 65536, 00:11:44.492 "uuid": "bcc3771d-e70b-4ce6-8365-70edcb60c340", 00:11:44.492 "assigned_rate_limits": { 00:11:44.492 "rw_ios_per_sec": 0, 00:11:44.492 "rw_mbytes_per_sec": 0, 00:11:44.492 "r_mbytes_per_sec": 0, 00:11:44.492 "w_mbytes_per_sec": 0 00:11:44.492 }, 00:11:44.492 "claimed": true, 00:11:44.492 "claim_type": "exclusive_write", 00:11:44.492 "zoned": false, 00:11:44.492 "supported_io_types": { 00:11:44.492 "read": true, 00:11:44.492 "write": true, 00:11:44.492 "unmap": true, 00:11:44.492 "write_zeroes": true, 00:11:44.492 "flush": true, 00:11:44.492 "reset": true, 00:11:44.492 "compare": false, 00:11:44.492 "compare_and_write": false, 00:11:44.492 "abort": true, 00:11:44.492 "nvme_admin": false, 00:11:44.492 "nvme_io": false 00:11:44.492 }, 00:11:44.492 "memory_domains": [ 00:11:44.492 { 00:11:44.492 "dma_device_id": "system", 00:11:44.492 "dma_device_type": 1 00:11:44.492 }, 00:11:44.492 { 00:11:44.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.492 "dma_device_type": 2 00:11:44.492 } 00:11:44.492 ], 
00:11:44.492 "driver_specific": {} 00:11:44.492 }' 00:11:44.492 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:44.751 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:44.751 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:44.751 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:44.751 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:44.751 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:44.751 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:44.751 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:44.751 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:44.751 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:45.009 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:45.009 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:45.009 11:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:45.267 [2024-05-14 11:48:12.131028] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:45.267 [2024-05-14 11:48:12.131054] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:45.267 [2024-05-14 11:48:12.131097] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:11:45.267 11:48:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.267 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:45.525 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:45.525 "name": 
"Existed_Raid", 00:11:45.525 "uuid": "8d89d6a1-0bea-4901-9e91-d43d28c9e7b7", 00:11:45.525 "strip_size_kb": 64, 00:11:45.525 "state": "offline", 00:11:45.525 "raid_level": "raid0", 00:11:45.525 "superblock": false, 00:11:45.525 "num_base_bdevs": 3, 00:11:45.525 "num_base_bdevs_discovered": 2, 00:11:45.525 "num_base_bdevs_operational": 2, 00:11:45.525 "base_bdevs_list": [ 00:11:45.525 { 00:11:45.525 "name": null, 00:11:45.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.525 "is_configured": false, 00:11:45.525 "data_offset": 0, 00:11:45.525 "data_size": 65536 00:11:45.525 }, 00:11:45.526 { 00:11:45.526 "name": "BaseBdev2", 00:11:45.526 "uuid": "6d2f9e5c-345e-4a2b-89d9-696524e1eb72", 00:11:45.526 "is_configured": true, 00:11:45.526 "data_offset": 0, 00:11:45.526 "data_size": 65536 00:11:45.526 }, 00:11:45.526 { 00:11:45.526 "name": "BaseBdev3", 00:11:45.526 "uuid": "bcc3771d-e70b-4ce6-8365-70edcb60c340", 00:11:45.526 "is_configured": true, 00:11:45.526 "data_offset": 0, 00:11:45.526 "data_size": 65536 00:11:45.526 } 00:11:45.526 ] 00:11:45.526 }' 00:11:45.526 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:45.526 11:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:46.092 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:11:46.092 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:46.092 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:46.092 11:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.350 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:46.350 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:11:46.350 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:46.609 [2024-05-14 11:48:13.460469] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:46.609 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:46.609 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:46.609 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.609 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:46.867 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:46.867 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:46.867 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:47.125 [2024-05-14 11:48:13.954171] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:47.125 [2024-05-14 11:48:13.954213] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x258b080 name Existed_Raid, state offline 00:11:47.125 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:47.125 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:47.125 11:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.125 11:48:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:11:47.383 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:11:47.383 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:11:47.383 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:11:47.383 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:11:47.383 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:11:47.383 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:47.383 BaseBdev2 00:11:47.642 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:11:47.642 11:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:47.642 11:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:47.642 11:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:47.642 11:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:47.642 11:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:47.642 11:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:47.642 11:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:47.900 [ 00:11:47.900 { 00:11:47.900 "name": 
"BaseBdev2", 00:11:47.900 "aliases": [ 00:11:47.900 "77be4c58-8071-4340-a519-1f6bf0bc4513" 00:11:47.901 ], 00:11:47.901 "product_name": "Malloc disk", 00:11:47.901 "block_size": 512, 00:11:47.901 "num_blocks": 65536, 00:11:47.901 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:11:47.901 "assigned_rate_limits": { 00:11:47.901 "rw_ios_per_sec": 0, 00:11:47.901 "rw_mbytes_per_sec": 0, 00:11:47.901 "r_mbytes_per_sec": 0, 00:11:47.901 "w_mbytes_per_sec": 0 00:11:47.901 }, 00:11:47.901 "claimed": false, 00:11:47.901 "zoned": false, 00:11:47.901 "supported_io_types": { 00:11:47.901 "read": true, 00:11:47.901 "write": true, 00:11:47.901 "unmap": true, 00:11:47.901 "write_zeroes": true, 00:11:47.901 "flush": true, 00:11:47.901 "reset": true, 00:11:47.901 "compare": false, 00:11:47.901 "compare_and_write": false, 00:11:47.901 "abort": true, 00:11:47.901 "nvme_admin": false, 00:11:47.901 "nvme_io": false 00:11:47.901 }, 00:11:47.901 "memory_domains": [ 00:11:47.901 { 00:11:47.901 "dma_device_id": "system", 00:11:47.901 "dma_device_type": 1 00:11:47.901 }, 00:11:47.901 { 00:11:47.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.901 "dma_device_type": 2 00:11:47.901 } 00:11:47.901 ], 00:11:47.901 "driver_specific": {} 00:11:47.901 } 00:11:47.901 ] 00:11:47.901 11:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:47.901 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:11:47.901 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:11:47.901 11:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:48.159 BaseBdev3 00:11:48.159 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:11:48.159 11:48:15 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:11:48.159 11:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:48.159 11:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:48.159 11:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:48.159 11:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:48.159 11:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:48.416 11:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:48.674 [ 00:11:48.674 { 00:11:48.674 "name": "BaseBdev3", 00:11:48.674 "aliases": [ 00:11:48.674 "e25c6573-b006-49ac-bc97-a69c5aefe240" 00:11:48.674 ], 00:11:48.674 "product_name": "Malloc disk", 00:11:48.674 "block_size": 512, 00:11:48.674 "num_blocks": 65536, 00:11:48.674 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:11:48.674 "assigned_rate_limits": { 00:11:48.674 "rw_ios_per_sec": 0, 00:11:48.674 "rw_mbytes_per_sec": 0, 00:11:48.674 "r_mbytes_per_sec": 0, 00:11:48.674 "w_mbytes_per_sec": 0 00:11:48.674 }, 00:11:48.674 "claimed": false, 00:11:48.674 "zoned": false, 00:11:48.674 "supported_io_types": { 00:11:48.674 "read": true, 00:11:48.674 "write": true, 00:11:48.674 "unmap": true, 00:11:48.674 "write_zeroes": true, 00:11:48.674 "flush": true, 00:11:48.674 "reset": true, 00:11:48.674 "compare": false, 00:11:48.674 "compare_and_write": false, 00:11:48.674 "abort": true, 00:11:48.674 "nvme_admin": false, 00:11:48.674 "nvme_io": false 00:11:48.674 }, 00:11:48.674 "memory_domains": [ 00:11:48.674 { 00:11:48.674 "dma_device_id": "system", 
00:11:48.674 "dma_device_type": 1 00:11:48.674 }, 00:11:48.674 { 00:11:48.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.674 "dma_device_type": 2 00:11:48.674 } 00:11:48.674 ], 00:11:48.674 "driver_specific": {} 00:11:48.674 } 00:11:48.674 ] 00:11:48.674 11:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:48.674 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:11:48.674 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:11:48.674 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:48.932 [2024-05-14 11:48:15.851744] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:48.932 [2024-05-14 11:48:15.851786] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:48.932 [2024-05-14 11:48:15.851813] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:48.932 [2024-05-14 11:48:15.853199] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.932 11:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.189 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:49.189 "name": "Existed_Raid", 00:11:49.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.189 "strip_size_kb": 64, 00:11:49.189 "state": "configuring", 00:11:49.189 "raid_level": "raid0", 00:11:49.189 "superblock": false, 00:11:49.189 "num_base_bdevs": 3, 00:11:49.189 "num_base_bdevs_discovered": 2, 00:11:49.189 "num_base_bdevs_operational": 3, 00:11:49.189 "base_bdevs_list": [ 00:11:49.189 { 00:11:49.189 "name": "BaseBdev1", 00:11:49.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.189 "is_configured": false, 00:11:49.189 "data_offset": 0, 00:11:49.189 "data_size": 0 00:11:49.189 }, 00:11:49.189 { 00:11:49.189 "name": "BaseBdev2", 00:11:49.189 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:11:49.189 "is_configured": true, 00:11:49.189 "data_offset": 0, 00:11:49.189 "data_size": 65536 00:11:49.189 }, 00:11:49.189 { 00:11:49.189 "name": "BaseBdev3", 00:11:49.189 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:11:49.189 "is_configured": true, 00:11:49.189 "data_offset": 0, 00:11:49.189 "data_size": 65536 00:11:49.189 
} 00:11:49.189 ] 00:11:49.189 }' 00:11:49.189 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:49.189 11:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:49.753 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:50.010 [2024-05-14 11:48:16.930583] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.010 11:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.268 11:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:50.268 "name": "Existed_Raid", 00:11:50.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.268 "strip_size_kb": 64, 00:11:50.268 "state": "configuring", 00:11:50.268 "raid_level": "raid0", 00:11:50.268 "superblock": false, 00:11:50.268 "num_base_bdevs": 3, 00:11:50.268 "num_base_bdevs_discovered": 1, 00:11:50.268 "num_base_bdevs_operational": 3, 00:11:50.268 "base_bdevs_list": [ 00:11:50.268 { 00:11:50.268 "name": "BaseBdev1", 00:11:50.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.268 "is_configured": false, 00:11:50.268 "data_offset": 0, 00:11:50.268 "data_size": 0 00:11:50.268 }, 00:11:50.268 { 00:11:50.268 "name": null, 00:11:50.268 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:11:50.268 "is_configured": false, 00:11:50.268 "data_offset": 0, 00:11:50.268 "data_size": 65536 00:11:50.268 }, 00:11:50.268 { 00:11:50.268 "name": "BaseBdev3", 00:11:50.268 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:11:50.269 "is_configured": true, 00:11:50.269 "data_offset": 0, 00:11:50.269 "data_size": 65536 00:11:50.269 } 00:11:50.269 ] 00:11:50.269 }' 00:11:50.269 11:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:50.269 11:48:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.836 11:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.836 11:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:51.095 11:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:11:51.095 11:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:51.353 [2024-05-14 11:48:18.205311] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:51.353 BaseBdev1 00:11:51.353 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:11:51.353 11:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:51.353 11:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:51.353 11:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:51.353 11:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:51.353 11:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:51.353 11:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:51.611 11:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:51.611 [ 00:11:51.611 { 00:11:51.611 "name": "BaseBdev1", 00:11:51.611 "aliases": [ 00:11:51.611 "953a054b-2164-458e-8df6-9f7a6c01dee5" 00:11:51.611 ], 00:11:51.611 "product_name": "Malloc disk", 00:11:51.611 "block_size": 512, 00:11:51.611 "num_blocks": 65536, 00:11:51.611 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:11:51.611 "assigned_rate_limits": { 00:11:51.611 "rw_ios_per_sec": 0, 00:11:51.611 "rw_mbytes_per_sec": 0, 00:11:51.611 "r_mbytes_per_sec": 0, 00:11:51.611 "w_mbytes_per_sec": 0 00:11:51.611 }, 00:11:51.611 "claimed": true, 00:11:51.611 "claim_type": "exclusive_write", 00:11:51.611 "zoned": 
false, 00:11:51.611 "supported_io_types": { 00:11:51.611 "read": true, 00:11:51.611 "write": true, 00:11:51.611 "unmap": true, 00:11:51.611 "write_zeroes": true, 00:11:51.611 "flush": true, 00:11:51.611 "reset": true, 00:11:51.611 "compare": false, 00:11:51.611 "compare_and_write": false, 00:11:51.611 "abort": true, 00:11:51.611 "nvme_admin": false, 00:11:51.611 "nvme_io": false 00:11:51.611 }, 00:11:51.611 "memory_domains": [ 00:11:51.611 { 00:11:51.611 "dma_device_id": "system", 00:11:51.611 "dma_device_type": 1 00:11:51.611 }, 00:11:51.611 { 00:11:51.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.611 "dma_device_type": 2 00:11:51.611 } 00:11:51.611 ], 00:11:51.611 "driver_specific": {} 00:11:51.611 } 00:11:51.611 ] 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 
00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:51.869 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:52.127 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:52.127 "name": "Existed_Raid", 00:11:52.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.127 "strip_size_kb": 64, 00:11:52.127 "state": "configuring", 00:11:52.127 "raid_level": "raid0", 00:11:52.127 "superblock": false, 00:11:52.127 "num_base_bdevs": 3, 00:11:52.127 "num_base_bdevs_discovered": 2, 00:11:52.127 "num_base_bdevs_operational": 3, 00:11:52.127 "base_bdevs_list": [ 00:11:52.127 { 00:11:52.127 "name": "BaseBdev1", 00:11:52.127 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:11:52.127 "is_configured": true, 00:11:52.127 "data_offset": 0, 00:11:52.127 "data_size": 65536 00:11:52.127 }, 00:11:52.127 { 00:11:52.127 "name": null, 00:11:52.127 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:11:52.127 "is_configured": false, 00:11:52.127 "data_offset": 0, 00:11:52.127 "data_size": 65536 00:11:52.127 }, 00:11:52.127 { 00:11:52.127 "name": "BaseBdev3", 00:11:52.127 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:11:52.127 "is_configured": true, 00:11:52.127 "data_offset": 0, 00:11:52.127 "data_size": 65536 00:11:52.127 } 00:11:52.127 ] 00:11:52.127 }' 00:11:52.127 11:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:52.127 11:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:52.693 11:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.693 11:48:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:52.964 11:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:11:52.964 11:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:52.964 [2024-05-14 11:48:20.038217] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:11:53.233 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:53.233 "name": "Existed_Raid", 00:11:53.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.233 "strip_size_kb": 64, 00:11:53.233 "state": "configuring", 00:11:53.233 "raid_level": "raid0", 00:11:53.233 "superblock": false, 00:11:53.233 "num_base_bdevs": 3, 00:11:53.233 "num_base_bdevs_discovered": 1, 00:11:53.233 "num_base_bdevs_operational": 3, 00:11:53.233 "base_bdevs_list": [ 00:11:53.233 { 00:11:53.233 "name": "BaseBdev1", 00:11:53.233 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:11:53.233 "is_configured": true, 00:11:53.233 "data_offset": 0, 00:11:53.233 "data_size": 65536 00:11:53.233 }, 00:11:53.233 { 00:11:53.233 "name": null, 00:11:53.233 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:11:53.233 "is_configured": false, 00:11:53.233 "data_offset": 0, 00:11:53.233 "data_size": 65536 00:11:53.233 }, 00:11:53.233 { 00:11:53.233 "name": null, 00:11:53.233 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:11:53.233 "is_configured": false, 00:11:53.233 "data_offset": 0, 00:11:53.233 "data_size": 65536 00:11:53.233 } 00:11:53.233 ] 00:11:53.233 }' 00:11:53.490 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:53.490 11:48:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.056 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.056 11:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:54.057 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:11:54.057 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:54.315 [2024-05-14 11:48:21.313634] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.315 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:54.573 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:54.573 "name": "Existed_Raid", 00:11:54.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.573 "strip_size_kb": 64, 00:11:54.573 "state": "configuring", 00:11:54.573 "raid_level": "raid0", 
00:11:54.573 "superblock": false, 00:11:54.573 "num_base_bdevs": 3, 00:11:54.573 "num_base_bdevs_discovered": 2, 00:11:54.573 "num_base_bdevs_operational": 3, 00:11:54.573 "base_bdevs_list": [ 00:11:54.573 { 00:11:54.573 "name": "BaseBdev1", 00:11:54.573 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:11:54.573 "is_configured": true, 00:11:54.573 "data_offset": 0, 00:11:54.573 "data_size": 65536 00:11:54.573 }, 00:11:54.573 { 00:11:54.573 "name": null, 00:11:54.573 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:11:54.573 "is_configured": false, 00:11:54.573 "data_offset": 0, 00:11:54.573 "data_size": 65536 00:11:54.573 }, 00:11:54.573 { 00:11:54.573 "name": "BaseBdev3", 00:11:54.573 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:11:54.573 "is_configured": true, 00:11:54.573 "data_offset": 0, 00:11:54.573 "data_size": 65536 00:11:54.573 } 00:11:54.573 ] 00:11:54.573 }' 00:11:54.573 11:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:54.573 11:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.138 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.138 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:55.396 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:11:55.396 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:55.653 [2024-05-14 11:48:22.625114] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 
00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.653 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:55.911 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:55.911 "name": "Existed_Raid", 00:11:55.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.911 "strip_size_kb": 64, 00:11:55.911 "state": "configuring", 00:11:55.911 "raid_level": "raid0", 00:11:55.911 "superblock": false, 00:11:55.911 "num_base_bdevs": 3, 00:11:55.911 "num_base_bdevs_discovered": 1, 00:11:55.911 "num_base_bdevs_operational": 3, 00:11:55.911 "base_bdevs_list": [ 00:11:55.911 { 00:11:55.911 "name": null, 00:11:55.911 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:11:55.911 "is_configured": false, 
00:11:55.911 "data_offset": 0, 00:11:55.911 "data_size": 65536 00:11:55.911 }, 00:11:55.911 { 00:11:55.911 "name": null, 00:11:55.911 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:11:55.911 "is_configured": false, 00:11:55.911 "data_offset": 0, 00:11:55.911 "data_size": 65536 00:11:55.911 }, 00:11:55.911 { 00:11:55.911 "name": "BaseBdev3", 00:11:55.911 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:11:55.911 "is_configured": true, 00:11:55.911 "data_offset": 0, 00:11:55.911 "data_size": 65536 00:11:55.911 } 00:11:55.911 ] 00:11:55.911 }' 00:11:55.911 11:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:55.911 11:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.476 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.476 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:56.733 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:11:56.733 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:56.991 [2024-05-14 11:48:23.951194] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.991 11:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:57.247 11:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:57.248 "name": "Existed_Raid", 00:11:57.248 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.248 "strip_size_kb": 64, 00:11:57.248 "state": "configuring", 00:11:57.248 "raid_level": "raid0", 00:11:57.248 "superblock": false, 00:11:57.248 "num_base_bdevs": 3, 00:11:57.248 "num_base_bdevs_discovered": 2, 00:11:57.248 "num_base_bdevs_operational": 3, 00:11:57.248 "base_bdevs_list": [ 00:11:57.248 { 00:11:57.248 "name": null, 00:11:57.248 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:11:57.248 "is_configured": false, 00:11:57.248 "data_offset": 0, 00:11:57.248 "data_size": 65536 00:11:57.248 }, 00:11:57.248 { 00:11:57.248 "name": "BaseBdev2", 00:11:57.248 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:11:57.248 "is_configured": true, 00:11:57.248 "data_offset": 0, 00:11:57.248 "data_size": 65536 00:11:57.248 }, 
00:11:57.248 { 00:11:57.248 "name": "BaseBdev3", 00:11:57.248 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:11:57.248 "is_configured": true, 00:11:57.248 "data_offset": 0, 00:11:57.248 "data_size": 65536 00:11:57.248 } 00:11:57.248 ] 00:11:57.248 }' 00:11:57.248 11:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:57.248 11:48:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.811 11:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.811 11:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:58.068 11:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:11:58.068 11:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.068 11:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:58.326 11:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 953a054b-2164-458e-8df6-9f7a6c01dee5 00:11:58.583 [2024-05-14 11:48:25.526659] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:58.583 [2024-05-14 11:48:25.526697] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x272eec0 00:11:58.583 [2024-05-14 11:48:25.526706] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:58.583 [2024-05-14 11:48:25.526901] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2733f00 00:11:58.583 
[2024-05-14 11:48:25.527028] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x272eec0 00:11:58.583 [2024-05-14 11:48:25.527038] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x272eec0 00:11:58.583 [2024-05-14 11:48:25.527195] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:58.583 NewBaseBdev 00:11:58.583 11:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:11:58.583 11:48:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:11:58.583 11:48:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:58.583 11:48:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:58.583 11:48:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:58.583 11:48:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:58.583 11:48:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:58.841 11:48:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:59.098 [ 00:11:59.098 { 00:11:59.098 "name": "NewBaseBdev", 00:11:59.098 "aliases": [ 00:11:59.098 "953a054b-2164-458e-8df6-9f7a6c01dee5" 00:11:59.098 ], 00:11:59.098 "product_name": "Malloc disk", 00:11:59.098 "block_size": 512, 00:11:59.098 "num_blocks": 65536, 00:11:59.098 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:11:59.098 "assigned_rate_limits": { 00:11:59.098 "rw_ios_per_sec": 0, 00:11:59.098 "rw_mbytes_per_sec": 0, 00:11:59.098 "r_mbytes_per_sec": 0, 00:11:59.098 
"w_mbytes_per_sec": 0 00:11:59.098 }, 00:11:59.098 "claimed": true, 00:11:59.098 "claim_type": "exclusive_write", 00:11:59.098 "zoned": false, 00:11:59.098 "supported_io_types": { 00:11:59.098 "read": true, 00:11:59.098 "write": true, 00:11:59.098 "unmap": true, 00:11:59.098 "write_zeroes": true, 00:11:59.098 "flush": true, 00:11:59.098 "reset": true, 00:11:59.098 "compare": false, 00:11:59.098 "compare_and_write": false, 00:11:59.098 "abort": true, 00:11:59.098 "nvme_admin": false, 00:11:59.098 "nvme_io": false 00:11:59.098 }, 00:11:59.098 "memory_domains": [ 00:11:59.098 { 00:11:59.098 "dma_device_id": "system", 00:11:59.098 "dma_device_type": 1 00:11:59.098 }, 00:11:59.098 { 00:11:59.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.098 "dma_device_type": 2 00:11:59.098 } 00:11:59.098 ], 00:11:59.098 "driver_specific": {} 00:11:59.098 } 00:11:59.098 ] 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # 
local num_base_bdevs_discovered 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:59.098 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.099 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:59.356 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:59.356 "name": "Existed_Raid", 00:11:59.356 "uuid": "15b6f7a4-ebe2-4de5-8450-51d04213930f", 00:11:59.356 "strip_size_kb": 64, 00:11:59.356 "state": "online", 00:11:59.356 "raid_level": "raid0", 00:11:59.356 "superblock": false, 00:11:59.356 "num_base_bdevs": 3, 00:11:59.356 "num_base_bdevs_discovered": 3, 00:11:59.356 "num_base_bdevs_operational": 3, 00:11:59.356 "base_bdevs_list": [ 00:11:59.356 { 00:11:59.356 "name": "NewBaseBdev", 00:11:59.356 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:11:59.356 "is_configured": true, 00:11:59.356 "data_offset": 0, 00:11:59.356 "data_size": 65536 00:11:59.356 }, 00:11:59.356 { 00:11:59.356 "name": "BaseBdev2", 00:11:59.356 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:11:59.356 "is_configured": true, 00:11:59.356 "data_offset": 0, 00:11:59.356 "data_size": 65536 00:11:59.356 }, 00:11:59.356 { 00:11:59.356 "name": "BaseBdev3", 00:11:59.356 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:11:59.356 "is_configured": true, 00:11:59.356 "data_offset": 0, 00:11:59.356 "data_size": 65536 00:11:59.356 } 00:11:59.356 ] 00:11:59.356 }' 00:11:59.356 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:59.356 11:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:59.922 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 
00:11:59.922 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:59.922 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:59.922 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:59.922 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:59.922 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:59.922 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:59.922 11:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:00.181 [2024-05-14 11:48:27.099132] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:00.181 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:00.181 "name": "Existed_Raid", 00:12:00.181 "aliases": [ 00:12:00.181 "15b6f7a4-ebe2-4de5-8450-51d04213930f" 00:12:00.181 ], 00:12:00.181 "product_name": "Raid Volume", 00:12:00.181 "block_size": 512, 00:12:00.181 "num_blocks": 196608, 00:12:00.181 "uuid": "15b6f7a4-ebe2-4de5-8450-51d04213930f", 00:12:00.181 "assigned_rate_limits": { 00:12:00.181 "rw_ios_per_sec": 0, 00:12:00.181 "rw_mbytes_per_sec": 0, 00:12:00.181 "r_mbytes_per_sec": 0, 00:12:00.181 "w_mbytes_per_sec": 0 00:12:00.181 }, 00:12:00.181 "claimed": false, 00:12:00.181 "zoned": false, 00:12:00.181 "supported_io_types": { 00:12:00.181 "read": true, 00:12:00.181 "write": true, 00:12:00.181 "unmap": true, 00:12:00.181 "write_zeroes": true, 00:12:00.181 "flush": true, 00:12:00.181 "reset": true, 00:12:00.181 "compare": false, 00:12:00.181 "compare_and_write": false, 00:12:00.181 "abort": false, 00:12:00.181 "nvme_admin": false, 00:12:00.181 
"nvme_io": false 00:12:00.181 }, 00:12:00.181 "memory_domains": [ 00:12:00.181 { 00:12:00.181 "dma_device_id": "system", 00:12:00.181 "dma_device_type": 1 00:12:00.181 }, 00:12:00.181 { 00:12:00.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.181 "dma_device_type": 2 00:12:00.181 }, 00:12:00.181 { 00:12:00.181 "dma_device_id": "system", 00:12:00.181 "dma_device_type": 1 00:12:00.181 }, 00:12:00.181 { 00:12:00.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.181 "dma_device_type": 2 00:12:00.181 }, 00:12:00.181 { 00:12:00.181 "dma_device_id": "system", 00:12:00.181 "dma_device_type": 1 00:12:00.181 }, 00:12:00.181 { 00:12:00.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.181 "dma_device_type": 2 00:12:00.181 } 00:12:00.181 ], 00:12:00.181 "driver_specific": { 00:12:00.181 "raid": { 00:12:00.181 "uuid": "15b6f7a4-ebe2-4de5-8450-51d04213930f", 00:12:00.181 "strip_size_kb": 64, 00:12:00.181 "state": "online", 00:12:00.181 "raid_level": "raid0", 00:12:00.181 "superblock": false, 00:12:00.181 "num_base_bdevs": 3, 00:12:00.181 "num_base_bdevs_discovered": 3, 00:12:00.181 "num_base_bdevs_operational": 3, 00:12:00.181 "base_bdevs_list": [ 00:12:00.181 { 00:12:00.181 "name": "NewBaseBdev", 00:12:00.181 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:12:00.181 "is_configured": true, 00:12:00.181 "data_offset": 0, 00:12:00.181 "data_size": 65536 00:12:00.181 }, 00:12:00.181 { 00:12:00.181 "name": "BaseBdev2", 00:12:00.181 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:12:00.181 "is_configured": true, 00:12:00.181 "data_offset": 0, 00:12:00.181 "data_size": 65536 00:12:00.181 }, 00:12:00.181 { 00:12:00.181 "name": "BaseBdev3", 00:12:00.181 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:12:00.181 "is_configured": true, 00:12:00.181 "data_offset": 0, 00:12:00.181 "data_size": 65536 00:12:00.181 } 00:12:00.181 ] 00:12:00.181 } 00:12:00.181 } 00:12:00.181 }' 00:12:00.181 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:00.181 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:12:00.181 BaseBdev2 00:12:00.181 BaseBdev3' 00:12:00.181 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:00.181 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:00.181 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:00.439 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:00.439 "name": "NewBaseBdev", 00:12:00.439 "aliases": [ 00:12:00.439 "953a054b-2164-458e-8df6-9f7a6c01dee5" 00:12:00.439 ], 00:12:00.439 "product_name": "Malloc disk", 00:12:00.439 "block_size": 512, 00:12:00.439 "num_blocks": 65536, 00:12:00.439 "uuid": "953a054b-2164-458e-8df6-9f7a6c01dee5", 00:12:00.439 "assigned_rate_limits": { 00:12:00.439 "rw_ios_per_sec": 0, 00:12:00.439 "rw_mbytes_per_sec": 0, 00:12:00.439 "r_mbytes_per_sec": 0, 00:12:00.439 "w_mbytes_per_sec": 0 00:12:00.439 }, 00:12:00.439 "claimed": true, 00:12:00.439 "claim_type": "exclusive_write", 00:12:00.439 "zoned": false, 00:12:00.439 "supported_io_types": { 00:12:00.439 "read": true, 00:12:00.439 "write": true, 00:12:00.439 "unmap": true, 00:12:00.439 "write_zeroes": true, 00:12:00.439 "flush": true, 00:12:00.439 "reset": true, 00:12:00.439 "compare": false, 00:12:00.439 "compare_and_write": false, 00:12:00.439 "abort": true, 00:12:00.439 "nvme_admin": false, 00:12:00.439 "nvme_io": false 00:12:00.439 }, 00:12:00.439 "memory_domains": [ 00:12:00.439 { 00:12:00.439 "dma_device_id": "system", 00:12:00.439 "dma_device_type": 1 00:12:00.439 }, 00:12:00.439 { 00:12:00.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.439 
"dma_device_type": 2 00:12:00.439 } 00:12:00.439 ], 00:12:00.439 "driver_specific": {} 00:12:00.439 }' 00:12:00.439 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:00.439 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:00.439 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:00.439 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:00.697 11:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:00.955 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:00.955 "name": "BaseBdev2", 00:12:00.955 "aliases": [ 00:12:00.955 "77be4c58-8071-4340-a519-1f6bf0bc4513" 00:12:00.955 ], 00:12:00.955 
"product_name": "Malloc disk", 00:12:00.955 "block_size": 512, 00:12:00.955 "num_blocks": 65536, 00:12:00.955 "uuid": "77be4c58-8071-4340-a519-1f6bf0bc4513", 00:12:00.955 "assigned_rate_limits": { 00:12:00.955 "rw_ios_per_sec": 0, 00:12:00.955 "rw_mbytes_per_sec": 0, 00:12:00.955 "r_mbytes_per_sec": 0, 00:12:00.955 "w_mbytes_per_sec": 0 00:12:00.955 }, 00:12:00.955 "claimed": true, 00:12:00.955 "claim_type": "exclusive_write", 00:12:00.955 "zoned": false, 00:12:00.955 "supported_io_types": { 00:12:00.955 "read": true, 00:12:00.955 "write": true, 00:12:00.955 "unmap": true, 00:12:00.955 "write_zeroes": true, 00:12:00.955 "flush": true, 00:12:00.955 "reset": true, 00:12:00.955 "compare": false, 00:12:00.955 "compare_and_write": false, 00:12:00.955 "abort": true, 00:12:00.955 "nvme_admin": false, 00:12:00.955 "nvme_io": false 00:12:00.955 }, 00:12:00.955 "memory_domains": [ 00:12:00.955 { 00:12:00.955 "dma_device_id": "system", 00:12:00.955 "dma_device_type": 1 00:12:00.955 }, 00:12:00.955 { 00:12:00.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.955 "dma_device_type": 2 00:12:00.955 } 00:12:00.955 ], 00:12:00.955 "driver_specific": {} 00:12:00.955 }' 00:12:00.955 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:01.213 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:01.213 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:01.213 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:01.214 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:01.214 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:01.214 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:01.214 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 
00:12:01.214 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:01.214 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:01.471 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:01.471 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:01.471 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:01.471 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:01.472 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:01.472 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:01.472 "name": "BaseBdev3", 00:12:01.472 "aliases": [ 00:12:01.472 "e25c6573-b006-49ac-bc97-a69c5aefe240" 00:12:01.472 ], 00:12:01.472 "product_name": "Malloc disk", 00:12:01.472 "block_size": 512, 00:12:01.472 "num_blocks": 65536, 00:12:01.472 "uuid": "e25c6573-b006-49ac-bc97-a69c5aefe240", 00:12:01.472 "assigned_rate_limits": { 00:12:01.472 "rw_ios_per_sec": 0, 00:12:01.472 "rw_mbytes_per_sec": 0, 00:12:01.472 "r_mbytes_per_sec": 0, 00:12:01.472 "w_mbytes_per_sec": 0 00:12:01.472 }, 00:12:01.472 "claimed": true, 00:12:01.472 "claim_type": "exclusive_write", 00:12:01.472 "zoned": false, 00:12:01.472 "supported_io_types": { 00:12:01.472 "read": true, 00:12:01.472 "write": true, 00:12:01.472 "unmap": true, 00:12:01.472 "write_zeroes": true, 00:12:01.472 "flush": true, 00:12:01.472 "reset": true, 00:12:01.472 "compare": false, 00:12:01.472 "compare_and_write": false, 00:12:01.472 "abort": true, 00:12:01.472 "nvme_admin": false, 00:12:01.472 "nvme_io": false 00:12:01.472 }, 00:12:01.472 "memory_domains": [ 00:12:01.472 { 00:12:01.472 
"dma_device_id": "system", 00:12:01.472 "dma_device_type": 1 00:12:01.472 }, 00:12:01.472 { 00:12:01.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.472 "dma_device_type": 2 00:12:01.472 } 00:12:01.472 ], 00:12:01.472 "driver_specific": {} 00:12:01.472 }' 00:12:01.472 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:01.730 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:01.730 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:01.730 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:01.730 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:01.730 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:01.730 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:01.730 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:01.730 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:01.730 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:01.988 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:01.988 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:01.988 11:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:02.247 [2024-05-14 11:48:29.088156] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:02.247 [2024-05-14 11:48:29.088182] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:02.247 [2024-05-14 
11:48:29.088237] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:02.247 [2024-05-14 11:48:29.088296] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:02.247 [2024-05-14 11:48:29.088308] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x272eec0 name Existed_Raid, state offline 00:12:02.247 11:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 1678548 00:12:02.247 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 1678548 ']' 00:12:02.247 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 1678548 00:12:02.247 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:12:02.247 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:12:02.247 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1678548 00:12:02.247 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:12:02.248 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:12:02.248 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1678548' 00:12:02.248 killing process with pid 1678548 00:12:02.248 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 1678548 00:12:02.248 [2024-05-14 11:48:29.160223] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:02.248 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 1678548 00:12:02.248 [2024-05-14 11:48:29.187729] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- 
# return 0 00:12:02.507 00:12:02.507 real 0m27.951s 00:12:02.507 user 0m51.335s 00:12:02.507 sys 0m4.983s 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.507 ************************************ 00:12:02.507 END TEST raid_state_function_test 00:12:02.507 ************************************ 00:12:02.507 11:48:29 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:12:02.507 11:48:29 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:12:02.507 11:48:29 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:12:02.507 11:48:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:02.507 ************************************ 00:12:02.507 START TEST raid_state_function_test_sb 00:12:02.507 ************************************ 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 3 true 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( 
i++ )) 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=1683250 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1683250' 00:12:02.507 Process raid pid: 1683250 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 1683250 /var/tmp/spdk-raid.sock 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1683250 ']' 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:02.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:12:02.507 11:48:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:02.507 [2024-05-14 11:48:29.571388] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:12:02.507 [2024-05-14 11:48:29.571471] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:02.766 [2024-05-14 11:48:29.702803] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:02.766 [2024-05-14 11:48:29.804622] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:03.025 [2024-05-14 11:48:29.865242] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:03.025 [2024-05-14 11:48:29.865276] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:03.592 11:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:12:03.592 11:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:12:03.592 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:03.850 [2024-05-14 11:48:30.730022] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:03.850 [2024-05-14 11:48:30.730065] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:03.850 [2024-05-14 11:48:30.730076] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:03.850 [2024-05-14 11:48:30.730092] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:03.850 [2024-05-14 11:48:30.730101] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:03.850 [2024-05-14 11:48:30.730120] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:03.850 11:48:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:03.850 11:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:04.108 11:48:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:04.108 "name": "Existed_Raid", 00:12:04.108 "uuid": "a8e2721d-2524-4287-b179-54c7252e6734", 00:12:04.108 "strip_size_kb": 64, 00:12:04.108 "state": "configuring", 00:12:04.108 "raid_level": "raid0", 00:12:04.108 "superblock": true, 00:12:04.108 "num_base_bdevs": 3, 00:12:04.108 "num_base_bdevs_discovered": 0, 00:12:04.108 "num_base_bdevs_operational": 3, 00:12:04.108 
"base_bdevs_list": [ 00:12:04.108 { 00:12:04.108 "name": "BaseBdev1", 00:12:04.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.108 "is_configured": false, 00:12:04.108 "data_offset": 0, 00:12:04.108 "data_size": 0 00:12:04.108 }, 00:12:04.108 { 00:12:04.108 "name": "BaseBdev2", 00:12:04.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.108 "is_configured": false, 00:12:04.108 "data_offset": 0, 00:12:04.108 "data_size": 0 00:12:04.108 }, 00:12:04.108 { 00:12:04.108 "name": "BaseBdev3", 00:12:04.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.108 "is_configured": false, 00:12:04.108 "data_offset": 0, 00:12:04.108 "data_size": 0 00:12:04.108 } 00:12:04.108 ] 00:12:04.108 }' 00:12:04.109 11:48:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:04.109 11:48:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:04.675 11:48:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:04.934 [2024-05-14 11:48:31.800464] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:04.934 [2024-05-14 11:48:31.800495] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b36700 name Existed_Raid, state configuring 00:12:04.934 11:48:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:04.934 [2024-05-14 11:48:31.997007] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:04.934 [2024-05-14 11:48:31.997033] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:04.934 [2024-05-14 11:48:31.997043] 
bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:04.934 [2024-05-14 11:48:31.997055] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:04.934 [2024-05-14 11:48:31.997063] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:04.934 [2024-05-14 11:48:31.997079] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:04.934 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:05.192 [2024-05-14 11:48:32.183387] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:05.192 BaseBdev1 00:12:05.192 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:12:05.192 11:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:05.192 11:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:05.192 11:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:05.192 11:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:05.192 11:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:05.192 11:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:05.449 11:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:05.708 [ 00:12:05.708 { 
00:12:05.708 "name": "BaseBdev1", 00:12:05.708 "aliases": [ 00:12:05.708 "6f928283-40bc-440f-946d-1aea6e98eb46" 00:12:05.708 ], 00:12:05.708 "product_name": "Malloc disk", 00:12:05.708 "block_size": 512, 00:12:05.708 "num_blocks": 65536, 00:12:05.708 "uuid": "6f928283-40bc-440f-946d-1aea6e98eb46", 00:12:05.708 "assigned_rate_limits": { 00:12:05.708 "rw_ios_per_sec": 0, 00:12:05.708 "rw_mbytes_per_sec": 0, 00:12:05.708 "r_mbytes_per_sec": 0, 00:12:05.708 "w_mbytes_per_sec": 0 00:12:05.708 }, 00:12:05.708 "claimed": true, 00:12:05.708 "claim_type": "exclusive_write", 00:12:05.708 "zoned": false, 00:12:05.708 "supported_io_types": { 00:12:05.708 "read": true, 00:12:05.708 "write": true, 00:12:05.708 "unmap": true, 00:12:05.708 "write_zeroes": true, 00:12:05.708 "flush": true, 00:12:05.708 "reset": true, 00:12:05.708 "compare": false, 00:12:05.708 "compare_and_write": false, 00:12:05.708 "abort": true, 00:12:05.708 "nvme_admin": false, 00:12:05.708 "nvme_io": false 00:12:05.708 }, 00:12:05.708 "memory_domains": [ 00:12:05.708 { 00:12:05.708 "dma_device_id": "system", 00:12:05.708 "dma_device_type": 1 00:12:05.708 }, 00:12:05.708 { 00:12:05.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:05.708 "dma_device_type": 2 00:12:05.708 } 00:12:05.708 ], 00:12:05.708 "driver_specific": {} 00:12:05.708 } 00:12:05.708 ] 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:05.708 11:48:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:05.708 "name": "Existed_Raid", 00:12:05.708 "uuid": "e0c408d3-400f-4228-9397-758283c6a19e", 00:12:05.708 "strip_size_kb": 64, 00:12:05.708 "state": "configuring", 00:12:05.708 "raid_level": "raid0", 00:12:05.708 "superblock": true, 00:12:05.708 "num_base_bdevs": 3, 00:12:05.708 "num_base_bdevs_discovered": 1, 00:12:05.708 "num_base_bdevs_operational": 3, 00:12:05.708 "base_bdevs_list": [ 00:12:05.708 { 00:12:05.708 "name": "BaseBdev1", 00:12:05.708 "uuid": "6f928283-40bc-440f-946d-1aea6e98eb46", 00:12:05.708 "is_configured": true, 00:12:05.708 "data_offset": 2048, 00:12:05.708 "data_size": 63488 00:12:05.708 }, 00:12:05.708 { 00:12:05.708 "name": "BaseBdev2", 00:12:05.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.708 "is_configured": false, 00:12:05.708 "data_offset": 0, 00:12:05.708 "data_size": 0 00:12:05.708 }, 00:12:05.708 { 00:12:05.708 "name": 
"BaseBdev3", 00:12:05.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.708 "is_configured": false, 00:12:05.708 "data_offset": 0, 00:12:05.708 "data_size": 0 00:12:05.708 } 00:12:05.708 ] 00:12:05.708 }' 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:05.708 11:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:06.274 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:06.532 [2024-05-14 11:48:33.506900] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:06.532 [2024-05-14 11:48:33.506944] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b35ff0 name Existed_Raid, state configuring 00:12:06.532 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:06.790 [2024-05-14 11:48:33.751600] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:06.790 [2024-05-14 11:48:33.753152] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:06.791 [2024-05-14 11:48:33.753187] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:06.791 [2024-05-14 11:48:33.753199] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:06.791 [2024-05-14 11:48:33.753210] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.791 11:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:07.049 11:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:07.049 "name": "Existed_Raid", 00:12:07.049 "uuid": "eb9040eb-48e6-4cd8-ac5c-dc4eea5b6508", 00:12:07.049 "strip_size_kb": 64, 00:12:07.049 "state": "configuring", 00:12:07.049 "raid_level": "raid0", 00:12:07.049 "superblock": true, 00:12:07.049 "num_base_bdevs": 3, 00:12:07.049 
"num_base_bdevs_discovered": 1, 00:12:07.049 "num_base_bdevs_operational": 3, 00:12:07.049 "base_bdevs_list": [ 00:12:07.049 { 00:12:07.049 "name": "BaseBdev1", 00:12:07.049 "uuid": "6f928283-40bc-440f-946d-1aea6e98eb46", 00:12:07.049 "is_configured": true, 00:12:07.049 "data_offset": 2048, 00:12:07.049 "data_size": 63488 00:12:07.049 }, 00:12:07.049 { 00:12:07.049 "name": "BaseBdev2", 00:12:07.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.049 "is_configured": false, 00:12:07.049 "data_offset": 0, 00:12:07.049 "data_size": 0 00:12:07.049 }, 00:12:07.049 { 00:12:07.049 "name": "BaseBdev3", 00:12:07.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.049 "is_configured": false, 00:12:07.049 "data_offset": 0, 00:12:07.049 "data_size": 0 00:12:07.049 } 00:12:07.049 ] 00:12:07.049 }' 00:12:07.049 11:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:07.049 11:48:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:07.615 11:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:07.873 [2024-05-14 11:48:34.833892] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:07.873 BaseBdev2 00:12:07.873 11:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:12:07.873 11:48:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:07.873 11:48:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:07.873 11:48:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:07.873 11:48:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:07.873 11:48:34 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:07.873 11:48:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:08.132 11:48:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:08.391 [ 00:12:08.391 { 00:12:08.391 "name": "BaseBdev2", 00:12:08.391 "aliases": [ 00:12:08.391 "0b5ae941-c0db-40ab-9d7a-87314426f8c1" 00:12:08.391 ], 00:12:08.391 "product_name": "Malloc disk", 00:12:08.391 "block_size": 512, 00:12:08.391 "num_blocks": 65536, 00:12:08.391 "uuid": "0b5ae941-c0db-40ab-9d7a-87314426f8c1", 00:12:08.391 "assigned_rate_limits": { 00:12:08.391 "rw_ios_per_sec": 0, 00:12:08.391 "rw_mbytes_per_sec": 0, 00:12:08.391 "r_mbytes_per_sec": 0, 00:12:08.391 "w_mbytes_per_sec": 0 00:12:08.391 }, 00:12:08.391 "claimed": true, 00:12:08.391 "claim_type": "exclusive_write", 00:12:08.391 "zoned": false, 00:12:08.391 "supported_io_types": { 00:12:08.391 "read": true, 00:12:08.391 "write": true, 00:12:08.391 "unmap": true, 00:12:08.391 "write_zeroes": true, 00:12:08.391 "flush": true, 00:12:08.391 "reset": true, 00:12:08.391 "compare": false, 00:12:08.391 "compare_and_write": false, 00:12:08.391 "abort": true, 00:12:08.391 "nvme_admin": false, 00:12:08.391 "nvme_io": false 00:12:08.391 }, 00:12:08.391 "memory_domains": [ 00:12:08.391 { 00:12:08.391 "dma_device_id": "system", 00:12:08.391 "dma_device_type": 1 00:12:08.391 }, 00:12:08.391 { 00:12:08.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.391 "dma_device_type": 2 00:12:08.391 } 00:12:08.391 ], 00:12:08.391 "driver_specific": {} 00:12:08.391 } 00:12:08.391 ] 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 
00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:08.391 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.649 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:08.649 "name": "Existed_Raid", 00:12:08.649 "uuid": "eb9040eb-48e6-4cd8-ac5c-dc4eea5b6508", 00:12:08.649 "strip_size_kb": 64, 00:12:08.649 
"state": "configuring", 00:12:08.649 "raid_level": "raid0", 00:12:08.649 "superblock": true, 00:12:08.649 "num_base_bdevs": 3, 00:12:08.649 "num_base_bdevs_discovered": 2, 00:12:08.649 "num_base_bdevs_operational": 3, 00:12:08.649 "base_bdevs_list": [ 00:12:08.649 { 00:12:08.649 "name": "BaseBdev1", 00:12:08.649 "uuid": "6f928283-40bc-440f-946d-1aea6e98eb46", 00:12:08.649 "is_configured": true, 00:12:08.649 "data_offset": 2048, 00:12:08.649 "data_size": 63488 00:12:08.649 }, 00:12:08.649 { 00:12:08.649 "name": "BaseBdev2", 00:12:08.649 "uuid": "0b5ae941-c0db-40ab-9d7a-87314426f8c1", 00:12:08.649 "is_configured": true, 00:12:08.649 "data_offset": 2048, 00:12:08.649 "data_size": 63488 00:12:08.649 }, 00:12:08.649 { 00:12:08.649 "name": "BaseBdev3", 00:12:08.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:08.649 "is_configured": false, 00:12:08.649 "data_offset": 0, 00:12:08.649 "data_size": 0 00:12:08.649 } 00:12:08.649 ] 00:12:08.649 }' 00:12:08.649 11:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:08.649 11:48:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:09.225 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:09.489 [2024-05-14 11:48:36.325231] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:09.489 [2024-05-14 11:48:36.325411] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b37080 00:12:09.489 [2024-05-14 11:48:36.325426] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:09.489 [2024-05-14 11:48:36.325603] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b36d50 00:12:09.489 [2024-05-14 11:48:36.325731] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x1b37080 00:12:09.489 [2024-05-14 11:48:36.325742] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b37080 00:12:09.489 [2024-05-14 11:48:36.325832] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:09.489 BaseBdev3 00:12:09.489 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:12:09.489 11:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:09.489 11:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:09.489 11:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:09.489 11:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:09.489 11:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:09.489 11:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:09.489 11:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:09.747 [ 00:12:09.747 { 00:12:09.747 "name": "BaseBdev3", 00:12:09.747 "aliases": [ 00:12:09.747 "e2f8833a-43bc-4a81-9d22-49fc7b648c51" 00:12:09.747 ], 00:12:09.747 "product_name": "Malloc disk", 00:12:09.747 "block_size": 512, 00:12:09.748 "num_blocks": 65536, 00:12:09.748 "uuid": "e2f8833a-43bc-4a81-9d22-49fc7b648c51", 00:12:09.748 "assigned_rate_limits": { 00:12:09.748 "rw_ios_per_sec": 0, 00:12:09.748 "rw_mbytes_per_sec": 0, 00:12:09.748 "r_mbytes_per_sec": 0, 00:12:09.748 "w_mbytes_per_sec": 0 00:12:09.748 }, 00:12:09.748 "claimed": true, 00:12:09.748 
"claim_type": "exclusive_write", 00:12:09.748 "zoned": false, 00:12:09.748 "supported_io_types": { 00:12:09.748 "read": true, 00:12:09.748 "write": true, 00:12:09.748 "unmap": true, 00:12:09.748 "write_zeroes": true, 00:12:09.748 "flush": true, 00:12:09.748 "reset": true, 00:12:09.748 "compare": false, 00:12:09.748 "compare_and_write": false, 00:12:09.748 "abort": true, 00:12:09.748 "nvme_admin": false, 00:12:09.748 "nvme_io": false 00:12:09.748 }, 00:12:09.748 "memory_domains": [ 00:12:09.748 { 00:12:09.748 "dma_device_id": "system", 00:12:09.748 "dma_device_type": 1 00:12:09.748 }, 00:12:09.748 { 00:12:09.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:09.748 "dma_device_type": 2 00:12:09.748 } 00:12:09.748 ], 00:12:09.748 "driver_specific": {} 00:12:09.748 } 00:12:09.748 ] 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:09.748 11:48:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.748 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:10.006 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:10.006 "name": "Existed_Raid", 00:12:10.006 "uuid": "eb9040eb-48e6-4cd8-ac5c-dc4eea5b6508", 00:12:10.006 "strip_size_kb": 64, 00:12:10.006 "state": "online", 00:12:10.006 "raid_level": "raid0", 00:12:10.007 "superblock": true, 00:12:10.007 "num_base_bdevs": 3, 00:12:10.007 "num_base_bdevs_discovered": 3, 00:12:10.007 "num_base_bdevs_operational": 3, 00:12:10.007 "base_bdevs_list": [ 00:12:10.007 { 00:12:10.007 "name": "BaseBdev1", 00:12:10.007 "uuid": "6f928283-40bc-440f-946d-1aea6e98eb46", 00:12:10.007 "is_configured": true, 00:12:10.007 "data_offset": 2048, 00:12:10.007 "data_size": 63488 00:12:10.007 }, 00:12:10.007 { 00:12:10.007 "name": "BaseBdev2", 00:12:10.007 "uuid": "0b5ae941-c0db-40ab-9d7a-87314426f8c1", 00:12:10.007 "is_configured": true, 00:12:10.007 "data_offset": 2048, 00:12:10.007 "data_size": 63488 00:12:10.007 }, 00:12:10.007 { 00:12:10.007 "name": "BaseBdev3", 00:12:10.007 "uuid": "e2f8833a-43bc-4a81-9d22-49fc7b648c51", 00:12:10.007 "is_configured": true, 00:12:10.007 "data_offset": 2048, 00:12:10.007 "data_size": 63488 00:12:10.007 } 00:12:10.007 ] 00:12:10.007 }' 00:12:10.007 11:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:10.007 11:48:36 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:10.573 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:12:10.573 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:10.573 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:10.573 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:10.573 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:10.573 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:12:10.573 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:10.573 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:10.832 [2024-05-14 11:48:37.753293] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:10.832 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:10.832 "name": "Existed_Raid", 00:12:10.832 "aliases": [ 00:12:10.832 "eb9040eb-48e6-4cd8-ac5c-dc4eea5b6508" 00:12:10.832 ], 00:12:10.832 "product_name": "Raid Volume", 00:12:10.832 "block_size": 512, 00:12:10.832 "num_blocks": 190464, 00:12:10.832 "uuid": "eb9040eb-48e6-4cd8-ac5c-dc4eea5b6508", 00:12:10.832 "assigned_rate_limits": { 00:12:10.832 "rw_ios_per_sec": 0, 00:12:10.832 "rw_mbytes_per_sec": 0, 00:12:10.832 "r_mbytes_per_sec": 0, 00:12:10.832 "w_mbytes_per_sec": 0 00:12:10.832 }, 00:12:10.832 "claimed": false, 00:12:10.832 "zoned": false, 00:12:10.832 "supported_io_types": { 00:12:10.832 "read": true, 00:12:10.832 "write": true, 00:12:10.832 "unmap": true, 
00:12:10.832 "write_zeroes": true, 00:12:10.832 "flush": true, 00:12:10.832 "reset": true, 00:12:10.832 "compare": false, 00:12:10.832 "compare_and_write": false, 00:12:10.832 "abort": false, 00:12:10.832 "nvme_admin": false, 00:12:10.832 "nvme_io": false 00:12:10.832 }, 00:12:10.832 "memory_domains": [ 00:12:10.832 { 00:12:10.832 "dma_device_id": "system", 00:12:10.832 "dma_device_type": 1 00:12:10.832 }, 00:12:10.832 { 00:12:10.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.832 "dma_device_type": 2 00:12:10.832 }, 00:12:10.832 { 00:12:10.832 "dma_device_id": "system", 00:12:10.832 "dma_device_type": 1 00:12:10.832 }, 00:12:10.832 { 00:12:10.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.832 "dma_device_type": 2 00:12:10.832 }, 00:12:10.832 { 00:12:10.832 "dma_device_id": "system", 00:12:10.832 "dma_device_type": 1 00:12:10.832 }, 00:12:10.832 { 00:12:10.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.832 "dma_device_type": 2 00:12:10.832 } 00:12:10.832 ], 00:12:10.832 "driver_specific": { 00:12:10.832 "raid": { 00:12:10.832 "uuid": "eb9040eb-48e6-4cd8-ac5c-dc4eea5b6508", 00:12:10.832 "strip_size_kb": 64, 00:12:10.832 "state": "online", 00:12:10.832 "raid_level": "raid0", 00:12:10.832 "superblock": true, 00:12:10.832 "num_base_bdevs": 3, 00:12:10.832 "num_base_bdevs_discovered": 3, 00:12:10.832 "num_base_bdevs_operational": 3, 00:12:10.832 "base_bdevs_list": [ 00:12:10.832 { 00:12:10.832 "name": "BaseBdev1", 00:12:10.832 "uuid": "6f928283-40bc-440f-946d-1aea6e98eb46", 00:12:10.832 "is_configured": true, 00:12:10.832 "data_offset": 2048, 00:12:10.832 "data_size": 63488 00:12:10.832 }, 00:12:10.832 { 00:12:10.832 "name": "BaseBdev2", 00:12:10.832 "uuid": "0b5ae941-c0db-40ab-9d7a-87314426f8c1", 00:12:10.832 "is_configured": true, 00:12:10.832 "data_offset": 2048, 00:12:10.832 "data_size": 63488 00:12:10.832 }, 00:12:10.832 { 00:12:10.832 "name": "BaseBdev3", 00:12:10.832 "uuid": "e2f8833a-43bc-4a81-9d22-49fc7b648c51", 00:12:10.832 
"is_configured": true, 00:12:10.832 "data_offset": 2048, 00:12:10.832 "data_size": 63488 00:12:10.832 } 00:12:10.832 ] 00:12:10.832 } 00:12:10.832 } 00:12:10.832 }' 00:12:10.832 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:10.832 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:12:10.832 BaseBdev2 00:12:10.832 BaseBdev3' 00:12:10.832 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:10.832 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:10.832 11:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:11.090 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:11.090 "name": "BaseBdev1", 00:12:11.090 "aliases": [ 00:12:11.090 "6f928283-40bc-440f-946d-1aea6e98eb46" 00:12:11.090 ], 00:12:11.090 "product_name": "Malloc disk", 00:12:11.090 "block_size": 512, 00:12:11.090 "num_blocks": 65536, 00:12:11.090 "uuid": "6f928283-40bc-440f-946d-1aea6e98eb46", 00:12:11.090 "assigned_rate_limits": { 00:12:11.090 "rw_ios_per_sec": 0, 00:12:11.090 "rw_mbytes_per_sec": 0, 00:12:11.090 "r_mbytes_per_sec": 0, 00:12:11.090 "w_mbytes_per_sec": 0 00:12:11.090 }, 00:12:11.090 "claimed": true, 00:12:11.090 "claim_type": "exclusive_write", 00:12:11.090 "zoned": false, 00:12:11.090 "supported_io_types": { 00:12:11.090 "read": true, 00:12:11.090 "write": true, 00:12:11.090 "unmap": true, 00:12:11.090 "write_zeroes": true, 00:12:11.090 "flush": true, 00:12:11.090 "reset": true, 00:12:11.090 "compare": false, 00:12:11.090 "compare_and_write": false, 00:12:11.090 "abort": true, 00:12:11.090 "nvme_admin": false, 00:12:11.090 
"nvme_io": false 00:12:11.090 }, 00:12:11.090 "memory_domains": [ 00:12:11.090 { 00:12:11.090 "dma_device_id": "system", 00:12:11.090 "dma_device_type": 1 00:12:11.090 }, 00:12:11.090 { 00:12:11.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.090 "dma_device_type": 2 00:12:11.090 } 00:12:11.090 ], 00:12:11.090 "driver_specific": {} 00:12:11.090 }' 00:12:11.090 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:11.090 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:11.090 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:11.090 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:11.349 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:11.349 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.349 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:11.349 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:11.349 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.349 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:11.349 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:11.349 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:11.349 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:11.607 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:11.607 11:48:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:11.607 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:11.607 "name": "BaseBdev2", 00:12:11.607 "aliases": [ 00:12:11.607 "0b5ae941-c0db-40ab-9d7a-87314426f8c1" 00:12:11.607 ], 00:12:11.607 "product_name": "Malloc disk", 00:12:11.607 "block_size": 512, 00:12:11.607 "num_blocks": 65536, 00:12:11.607 "uuid": "0b5ae941-c0db-40ab-9d7a-87314426f8c1", 00:12:11.607 "assigned_rate_limits": { 00:12:11.607 "rw_ios_per_sec": 0, 00:12:11.607 "rw_mbytes_per_sec": 0, 00:12:11.607 "r_mbytes_per_sec": 0, 00:12:11.607 "w_mbytes_per_sec": 0 00:12:11.607 }, 00:12:11.607 "claimed": true, 00:12:11.607 "claim_type": "exclusive_write", 00:12:11.607 "zoned": false, 00:12:11.607 "supported_io_types": { 00:12:11.607 "read": true, 00:12:11.607 "write": true, 00:12:11.607 "unmap": true, 00:12:11.607 "write_zeroes": true, 00:12:11.607 "flush": true, 00:12:11.607 "reset": true, 00:12:11.607 "compare": false, 00:12:11.607 "compare_and_write": false, 00:12:11.607 "abort": true, 00:12:11.607 "nvme_admin": false, 00:12:11.607 "nvme_io": false 00:12:11.607 }, 00:12:11.607 "memory_domains": [ 00:12:11.607 { 00:12:11.607 "dma_device_id": "system", 00:12:11.607 "dma_device_type": 1 00:12:11.607 }, 00:12:11.607 { 00:12:11.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.607 "dma_device_type": 2 00:12:11.607 } 00:12:11.607 ], 00:12:11.607 "driver_specific": {} 00:12:11.607 }' 00:12:11.607 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:11.866 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:11.866 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:11.866 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:11.866 11:48:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:11.866 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.866 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:11.866 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:11.866 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.866 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:12.124 11:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:12.124 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:12.124 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:12.124 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:12.124 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:12.383 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:12.383 "name": "BaseBdev3", 00:12:12.383 "aliases": [ 00:12:12.383 "e2f8833a-43bc-4a81-9d22-49fc7b648c51" 00:12:12.383 ], 00:12:12.383 "product_name": "Malloc disk", 00:12:12.383 "block_size": 512, 00:12:12.383 "num_blocks": 65536, 00:12:12.383 "uuid": "e2f8833a-43bc-4a81-9d22-49fc7b648c51", 00:12:12.383 "assigned_rate_limits": { 00:12:12.383 "rw_ios_per_sec": 0, 00:12:12.383 "rw_mbytes_per_sec": 0, 00:12:12.383 "r_mbytes_per_sec": 0, 00:12:12.383 "w_mbytes_per_sec": 0 00:12:12.383 }, 00:12:12.383 "claimed": true, 00:12:12.383 "claim_type": "exclusive_write", 00:12:12.383 "zoned": false, 00:12:12.383 "supported_io_types": { 00:12:12.383 "read": true, 00:12:12.383 
"write": true, 00:12:12.383 "unmap": true, 00:12:12.383 "write_zeroes": true, 00:12:12.383 "flush": true, 00:12:12.383 "reset": true, 00:12:12.383 "compare": false, 00:12:12.383 "compare_and_write": false, 00:12:12.383 "abort": true, 00:12:12.383 "nvme_admin": false, 00:12:12.383 "nvme_io": false 00:12:12.383 }, 00:12:12.383 "memory_domains": [ 00:12:12.383 { 00:12:12.383 "dma_device_id": "system", 00:12:12.383 "dma_device_type": 1 00:12:12.383 }, 00:12:12.383 { 00:12:12.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.383 "dma_device_type": 2 00:12:12.383 } 00:12:12.383 ], 00:12:12.383 "driver_specific": {} 00:12:12.383 }' 00:12:12.383 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:12.383 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:12.383 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:12.383 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:12.383 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:12.383 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:12.383 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:12.641 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:12.641 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:12.641 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:12.641 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:12.641 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:12.641 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:12.900 [2024-05-14 11:48:39.854851] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:12.900 [2024-05-14 11:48:39.854879] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:12.900 [2024-05-14 11:48:39.854922] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:12.900 11:48:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.900 11:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:13.158 11:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:13.158 "name": "Existed_Raid", 00:12:13.158 "uuid": "eb9040eb-48e6-4cd8-ac5c-dc4eea5b6508", 00:12:13.158 "strip_size_kb": 64, 00:12:13.158 "state": "offline", 00:12:13.158 "raid_level": "raid0", 00:12:13.158 "superblock": true, 00:12:13.158 "num_base_bdevs": 3, 00:12:13.158 "num_base_bdevs_discovered": 2, 00:12:13.158 "num_base_bdevs_operational": 2, 00:12:13.158 "base_bdevs_list": [ 00:12:13.158 { 00:12:13.158 "name": null, 00:12:13.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:13.158 "is_configured": false, 00:12:13.158 "data_offset": 2048, 00:12:13.158 "data_size": 63488 00:12:13.158 }, 00:12:13.158 { 00:12:13.158 "name": "BaseBdev2", 00:12:13.158 "uuid": "0b5ae941-c0db-40ab-9d7a-87314426f8c1", 00:12:13.158 "is_configured": true, 00:12:13.158 "data_offset": 2048, 00:12:13.158 "data_size": 63488 00:12:13.158 }, 00:12:13.158 { 00:12:13.158 "name": "BaseBdev3", 00:12:13.158 "uuid": "e2f8833a-43bc-4a81-9d22-49fc7b648c51", 00:12:13.159 "is_configured": true, 00:12:13.159 "data_offset": 2048, 00:12:13.159 "data_size": 63488 00:12:13.159 } 00:12:13.159 ] 00:12:13.159 }' 00:12:13.159 11:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:13.159 11:48:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:13.723 11:48:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:12:13.723 11:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:13.723 11:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.723 11:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:13.982 11:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:13.982 11:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:13.982 11:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:14.239 [2024-05-14 11:48:41.195451] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:14.239 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:14.239 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:14.239 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.239 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:14.498 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:14.498 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:14.498 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:14.755 [2024-05-14 11:48:41.687350] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:14.755 [2024-05-14 11:48:41.687404] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b37080 name Existed_Raid, state offline 00:12:14.755 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:14.755 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:14.755 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.755 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:12:15.014 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:12:15.014 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:12:15.014 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:12:15.014 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:12:15.014 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:15.014 11:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:15.272 BaseBdev2 00:12:15.272 11:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:12:15.272 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:15.272 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:15.272 11:48:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:15.272 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:15.272 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:15.272 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:15.530 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:15.789 [ 00:12:15.789 { 00:12:15.789 "name": "BaseBdev2", 00:12:15.789 "aliases": [ 00:12:15.789 "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61" 00:12:15.789 ], 00:12:15.789 "product_name": "Malloc disk", 00:12:15.789 "block_size": 512, 00:12:15.789 "num_blocks": 65536, 00:12:15.789 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:15.789 "assigned_rate_limits": { 00:12:15.789 "rw_ios_per_sec": 0, 00:12:15.789 "rw_mbytes_per_sec": 0, 00:12:15.789 "r_mbytes_per_sec": 0, 00:12:15.789 "w_mbytes_per_sec": 0 00:12:15.789 }, 00:12:15.789 "claimed": false, 00:12:15.789 "zoned": false, 00:12:15.789 "supported_io_types": { 00:12:15.789 "read": true, 00:12:15.789 "write": true, 00:12:15.789 "unmap": true, 00:12:15.789 "write_zeroes": true, 00:12:15.789 "flush": true, 00:12:15.789 "reset": true, 00:12:15.789 "compare": false, 00:12:15.789 "compare_and_write": false, 00:12:15.789 "abort": true, 00:12:15.789 "nvme_admin": false, 00:12:15.789 "nvme_io": false 00:12:15.789 }, 00:12:15.789 "memory_domains": [ 00:12:15.789 { 00:12:15.789 "dma_device_id": "system", 00:12:15.789 "dma_device_type": 1 00:12:15.789 }, 00:12:15.789 { 00:12:15.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.789 "dma_device_type": 2 00:12:15.789 } 00:12:15.789 ], 
00:12:15.789 "driver_specific": {} 00:12:15.789 } 00:12:15.789 ] 00:12:15.789 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:15.789 11:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:15.789 11:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:15.789 11:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:16.047 BaseBdev3 00:12:16.047 11:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:12:16.047 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:16.047 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:16.047 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:16.047 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:16.047 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:16.047 11:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:16.306 11:48:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:16.565 [ 00:12:16.565 { 00:12:16.565 "name": "BaseBdev3", 00:12:16.565 "aliases": [ 00:12:16.565 "31ecaa00-9ad4-4164-8263-664ca487087a" 00:12:16.565 ], 00:12:16.565 "product_name": "Malloc disk", 00:12:16.565 "block_size": 512, 00:12:16.565 
"num_blocks": 65536, 00:12:16.565 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:16.565 "assigned_rate_limits": { 00:12:16.565 "rw_ios_per_sec": 0, 00:12:16.565 "rw_mbytes_per_sec": 0, 00:12:16.565 "r_mbytes_per_sec": 0, 00:12:16.565 "w_mbytes_per_sec": 0 00:12:16.565 }, 00:12:16.565 "claimed": false, 00:12:16.565 "zoned": false, 00:12:16.565 "supported_io_types": { 00:12:16.565 "read": true, 00:12:16.565 "write": true, 00:12:16.565 "unmap": true, 00:12:16.565 "write_zeroes": true, 00:12:16.565 "flush": true, 00:12:16.565 "reset": true, 00:12:16.565 "compare": false, 00:12:16.565 "compare_and_write": false, 00:12:16.565 "abort": true, 00:12:16.565 "nvme_admin": false, 00:12:16.565 "nvme_io": false 00:12:16.565 }, 00:12:16.565 "memory_domains": [ 00:12:16.565 { 00:12:16.565 "dma_device_id": "system", 00:12:16.565 "dma_device_type": 1 00:12:16.565 }, 00:12:16.565 { 00:12:16.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.565 "dma_device_type": 2 00:12:16.565 } 00:12:16.565 ], 00:12:16.565 "driver_specific": {} 00:12:16.565 } 00:12:16.565 ] 00:12:16.565 11:48:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:16.565 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:16.565 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:16.565 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:16.565 [2024-05-14 11:48:43.629751] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:16.565 [2024-05-14 11:48:43.629797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:16.565 [2024-05-14 11:48:43.629817] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:16.565 [2024-05-14 11:48:43.631200] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:16.565 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:16.566 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:16.566 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:16.566 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:16.566 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:16.566 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:16.825 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:16.825 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:16.825 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:16.825 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:16.825 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.825 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:16.825 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:16.825 "name": "Existed_Raid", 00:12:16.825 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:16.825 "strip_size_kb": 64, 00:12:16.825 "state": 
"configuring", 00:12:16.825 "raid_level": "raid0", 00:12:16.825 "superblock": true, 00:12:16.825 "num_base_bdevs": 3, 00:12:16.825 "num_base_bdevs_discovered": 2, 00:12:16.825 "num_base_bdevs_operational": 3, 00:12:16.825 "base_bdevs_list": [ 00:12:16.825 { 00:12:16.825 "name": "BaseBdev1", 00:12:16.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.825 "is_configured": false, 00:12:16.825 "data_offset": 0, 00:12:16.825 "data_size": 0 00:12:16.825 }, 00:12:16.825 { 00:12:16.825 "name": "BaseBdev2", 00:12:16.825 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:16.825 "is_configured": true, 00:12:16.825 "data_offset": 2048, 00:12:16.825 "data_size": 63488 00:12:16.825 }, 00:12:16.825 { 00:12:16.825 "name": "BaseBdev3", 00:12:16.825 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:16.825 "is_configured": true, 00:12:16.825 "data_offset": 2048, 00:12:16.825 "data_size": 63488 00:12:16.825 } 00:12:16.825 ] 00:12:16.825 }' 00:12:16.825 11:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:16.825 11:48:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:17.391 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:17.649 [2024-05-14 11:48:44.672478] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 
00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.649 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:17.908 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:17.908 "name": "Existed_Raid", 00:12:17.908 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:17.908 "strip_size_kb": 64, 00:12:17.908 "state": "configuring", 00:12:17.908 "raid_level": "raid0", 00:12:17.908 "superblock": true, 00:12:17.908 "num_base_bdevs": 3, 00:12:17.908 "num_base_bdevs_discovered": 1, 00:12:17.908 "num_base_bdevs_operational": 3, 00:12:17.908 "base_bdevs_list": [ 00:12:17.908 { 00:12:17.908 "name": "BaseBdev1", 00:12:17.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.908 "is_configured": false, 00:12:17.908 "data_offset": 0, 00:12:17.908 "data_size": 0 00:12:17.908 }, 00:12:17.908 { 00:12:17.908 "name": null, 00:12:17.908 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:17.908 "is_configured": false, 00:12:17.908 "data_offset": 2048, 00:12:17.908 "data_size": 63488 00:12:17.908 }, 00:12:17.908 { 00:12:17.908 
"name": "BaseBdev3", 00:12:17.908 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:17.908 "is_configured": true, 00:12:17.908 "data_offset": 2048, 00:12:17.908 "data_size": 63488 00:12:17.908 } 00:12:17.908 ] 00:12:17.908 }' 00:12:17.908 11:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:17.908 11:48:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:18.475 11:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.475 11:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:18.733 11:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:12:18.733 11:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:18.992 [2024-05-14 11:48:45.951314] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:18.992 BaseBdev1 00:12:18.992 11:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:12:18.992 11:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:18.992 11:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:18.992 11:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:18.992 11:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:18.992 11:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:18.992 11:48:45 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:19.251 11:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:19.510 [ 00:12:19.510 { 00:12:19.510 "name": "BaseBdev1", 00:12:19.510 "aliases": [ 00:12:19.510 "1b012eea-0225-4a67-8cb6-b402fcd2169b" 00:12:19.510 ], 00:12:19.510 "product_name": "Malloc disk", 00:12:19.510 "block_size": 512, 00:12:19.510 "num_blocks": 65536, 00:12:19.510 "uuid": "1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:19.510 "assigned_rate_limits": { 00:12:19.510 "rw_ios_per_sec": 0, 00:12:19.510 "rw_mbytes_per_sec": 0, 00:12:19.510 "r_mbytes_per_sec": 0, 00:12:19.510 "w_mbytes_per_sec": 0 00:12:19.510 }, 00:12:19.510 "claimed": true, 00:12:19.510 "claim_type": "exclusive_write", 00:12:19.510 "zoned": false, 00:12:19.510 "supported_io_types": { 00:12:19.510 "read": true, 00:12:19.510 "write": true, 00:12:19.510 "unmap": true, 00:12:19.510 "write_zeroes": true, 00:12:19.510 "flush": true, 00:12:19.510 "reset": true, 00:12:19.510 "compare": false, 00:12:19.510 "compare_and_write": false, 00:12:19.510 "abort": true, 00:12:19.510 "nvme_admin": false, 00:12:19.510 "nvme_io": false 00:12:19.510 }, 00:12:19.510 "memory_domains": [ 00:12:19.510 { 00:12:19.510 "dma_device_id": "system", 00:12:19.510 "dma_device_type": 1 00:12:19.510 }, 00:12:19.510 { 00:12:19.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.510 "dma_device_type": 2 00:12:19.510 } 00:12:19.510 ], 00:12:19.510 "driver_specific": {} 00:12:19.510 } 00:12:19.510 ] 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state 
Existed_Raid configuring raid0 64 3 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:19.510 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.768 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:19.768 "name": "Existed_Raid", 00:12:19.768 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:19.768 "strip_size_kb": 64, 00:12:19.768 "state": "configuring", 00:12:19.768 "raid_level": "raid0", 00:12:19.768 "superblock": true, 00:12:19.768 "num_base_bdevs": 3, 00:12:19.768 "num_base_bdevs_discovered": 2, 00:12:19.768 "num_base_bdevs_operational": 3, 00:12:19.768 "base_bdevs_list": [ 00:12:19.768 { 00:12:19.768 "name": "BaseBdev1", 00:12:19.768 "uuid": 
"1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:19.768 "is_configured": true, 00:12:19.768 "data_offset": 2048, 00:12:19.768 "data_size": 63488 00:12:19.768 }, 00:12:19.768 { 00:12:19.768 "name": null, 00:12:19.768 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:19.768 "is_configured": false, 00:12:19.768 "data_offset": 2048, 00:12:19.768 "data_size": 63488 00:12:19.768 }, 00:12:19.768 { 00:12:19.768 "name": "BaseBdev3", 00:12:19.768 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:19.768 "is_configured": true, 00:12:19.768 "data_offset": 2048, 00:12:19.768 "data_size": 63488 00:12:19.768 } 00:12:19.768 ] 00:12:19.768 }' 00:12:19.768 11:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:19.768 11:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:20.334 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.334 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:20.334 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:20.592 [2024-05-14 11:48:47.643841] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
expected_state=configuring 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.592 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:20.849 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:20.849 "name": "Existed_Raid", 00:12:20.849 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:20.849 "strip_size_kb": 64, 00:12:20.849 "state": "configuring", 00:12:20.849 "raid_level": "raid0", 00:12:20.849 "superblock": true, 00:12:20.849 "num_base_bdevs": 3, 00:12:20.849 "num_base_bdevs_discovered": 1, 00:12:20.849 "num_base_bdevs_operational": 3, 00:12:20.849 "base_bdevs_list": [ 00:12:20.849 { 00:12:20.849 "name": "BaseBdev1", 00:12:20.849 "uuid": "1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:20.849 "is_configured": true, 00:12:20.850 "data_offset": 2048, 00:12:20.850 "data_size": 63488 00:12:20.850 }, 00:12:20.850 { 00:12:20.850 "name": null, 00:12:20.850 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 
00:12:20.850 "is_configured": false, 00:12:20.850 "data_offset": 2048, 00:12:20.850 "data_size": 63488 00:12:20.850 }, 00:12:20.850 { 00:12:20.850 "name": null, 00:12:20.850 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:20.850 "is_configured": false, 00:12:20.850 "data_offset": 2048, 00:12:20.850 "data_size": 63488 00:12:20.850 } 00:12:20.850 ] 00:12:20.850 }' 00:12:20.850 11:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:20.850 11:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:21.783 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.783 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:21.783 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:12:21.783 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:22.041 [2024-05-14 11:48:48.967364] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
strip_size=64 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.041 11:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.300 11:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:22.300 "name": "Existed_Raid", 00:12:22.300 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:22.300 "strip_size_kb": 64, 00:12:22.300 "state": "configuring", 00:12:22.300 "raid_level": "raid0", 00:12:22.300 "superblock": true, 00:12:22.300 "num_base_bdevs": 3, 00:12:22.300 "num_base_bdevs_discovered": 2, 00:12:22.300 "num_base_bdevs_operational": 3, 00:12:22.300 "base_bdevs_list": [ 00:12:22.300 { 00:12:22.300 "name": "BaseBdev1", 00:12:22.300 "uuid": "1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:22.300 "is_configured": true, 00:12:22.300 "data_offset": 2048, 00:12:22.300 "data_size": 63488 00:12:22.300 }, 00:12:22.300 { 00:12:22.300 "name": null, 00:12:22.300 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:22.300 "is_configured": false, 00:12:22.300 "data_offset": 2048, 00:12:22.300 "data_size": 63488 00:12:22.300 }, 00:12:22.300 { 00:12:22.300 "name": "BaseBdev3", 00:12:22.300 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 
00:12:22.300 "is_configured": true, 00:12:22.300 "data_offset": 2048, 00:12:22.300 "data_size": 63488 00:12:22.300 } 00:12:22.300 ] 00:12:22.300 }' 00:12:22.300 11:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:22.300 11:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:22.866 11:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.866 11:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:23.124 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:12:23.124 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:23.382 [2024-05-14 11:48:50.278863] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:23.382 11:48:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.382 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:23.639 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:23.639 "name": "Existed_Raid", 00:12:23.639 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:23.639 "strip_size_kb": 64, 00:12:23.639 "state": "configuring", 00:12:23.639 "raid_level": "raid0", 00:12:23.639 "superblock": true, 00:12:23.639 "num_base_bdevs": 3, 00:12:23.639 "num_base_bdevs_discovered": 1, 00:12:23.639 "num_base_bdevs_operational": 3, 00:12:23.639 "base_bdevs_list": [ 00:12:23.639 { 00:12:23.639 "name": null, 00:12:23.639 "uuid": "1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:23.639 "is_configured": false, 00:12:23.639 "data_offset": 2048, 00:12:23.639 "data_size": 63488 00:12:23.639 }, 00:12:23.639 { 00:12:23.639 "name": null, 00:12:23.639 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:23.639 "is_configured": false, 00:12:23.639 "data_offset": 2048, 00:12:23.639 "data_size": 63488 00:12:23.639 }, 00:12:23.639 { 00:12:23.639 "name": "BaseBdev3", 00:12:23.639 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:23.639 "is_configured": true, 00:12:23.639 "data_offset": 2048, 00:12:23.639 "data_size": 63488 00:12:23.639 } 00:12:23.639 ] 00:12:23.639 }' 00:12:23.640 11:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:23.640 11:48:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:24.205 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:24.206 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.463 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:12:24.463 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:24.721 [2024-05-14 11:48:51.628947] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:24.721 11:48:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.721 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:24.979 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:24.979 "name": "Existed_Raid", 00:12:24.979 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:24.979 "strip_size_kb": 64, 00:12:24.979 "state": "configuring", 00:12:24.979 "raid_level": "raid0", 00:12:24.979 "superblock": true, 00:12:24.979 "num_base_bdevs": 3, 00:12:24.979 "num_base_bdevs_discovered": 2, 00:12:24.979 "num_base_bdevs_operational": 3, 00:12:24.979 "base_bdevs_list": [ 00:12:24.979 { 00:12:24.979 "name": null, 00:12:24.979 "uuid": "1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:24.979 "is_configured": false, 00:12:24.979 "data_offset": 2048, 00:12:24.979 "data_size": 63488 00:12:24.979 }, 00:12:24.979 { 00:12:24.979 "name": "BaseBdev2", 00:12:24.979 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:24.979 "is_configured": true, 00:12:24.979 "data_offset": 2048, 00:12:24.979 "data_size": 63488 00:12:24.979 }, 00:12:24.979 { 00:12:24.979 "name": "BaseBdev3", 00:12:24.979 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:24.979 "is_configured": true, 00:12:24.979 "data_offset": 2048, 00:12:24.979 "data_size": 63488 00:12:24.979 } 00:12:24.979 ] 00:12:24.979 }' 00:12:24.979 11:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:24.979 11:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:25.575 11:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:25.575 11:48:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.846 11:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:12:25.846 11:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.846 11:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:26.114 11:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1b012eea-0225-4a67-8cb6-b402fcd2169b 00:12:26.383 [2024-05-14 11:48:53.208651] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:26.383 [2024-05-14 11:48:53.208810] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cdc230 00:12:26.383 [2024-05-14 11:48:53.208824] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:26.383 [2024-05-14 11:48:53.209005] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ce92f0 00:12:26.383 [2024-05-14 11:48:53.209134] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cdc230 00:12:26.383 [2024-05-14 11:48:53.209143] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cdc230 00:12:26.383 [2024-05-14 11:48:53.209238] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:26.383 NewBaseBdev 00:12:26.383 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:12:26.383 11:48:53 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:12:26.383 11:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:26.383 11:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:26.383 11:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:26.383 11:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:26.383 11:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:26.383 11:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:26.640 [ 00:12:26.640 { 00:12:26.640 "name": "NewBaseBdev", 00:12:26.640 "aliases": [ 00:12:26.640 "1b012eea-0225-4a67-8cb6-b402fcd2169b" 00:12:26.640 ], 00:12:26.640 "product_name": "Malloc disk", 00:12:26.640 "block_size": 512, 00:12:26.640 "num_blocks": 65536, 00:12:26.640 "uuid": "1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:26.640 "assigned_rate_limits": { 00:12:26.640 "rw_ios_per_sec": 0, 00:12:26.640 "rw_mbytes_per_sec": 0, 00:12:26.640 "r_mbytes_per_sec": 0, 00:12:26.640 "w_mbytes_per_sec": 0 00:12:26.640 }, 00:12:26.640 "claimed": true, 00:12:26.640 "claim_type": "exclusive_write", 00:12:26.640 "zoned": false, 00:12:26.640 "supported_io_types": { 00:12:26.640 "read": true, 00:12:26.640 "write": true, 00:12:26.640 "unmap": true, 00:12:26.640 "write_zeroes": true, 00:12:26.640 "flush": true, 00:12:26.640 "reset": true, 00:12:26.640 "compare": false, 00:12:26.640 "compare_and_write": false, 00:12:26.640 "abort": true, 00:12:26.640 "nvme_admin": false, 00:12:26.640 "nvme_io": false 00:12:26.640 }, 00:12:26.640 
"memory_domains": [ 00:12:26.640 { 00:12:26.640 "dma_device_id": "system", 00:12:26.640 "dma_device_type": 1 00:12:26.640 }, 00:12:26.640 { 00:12:26.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:26.640 "dma_device_type": 2 00:12:26.640 } 00:12:26.640 ], 00:12:26.640 "driver_specific": {} 00:12:26.640 } 00:12:26.640 ] 00:12:26.640 11:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:26.640 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:26.640 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:26.640 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:26.640 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:26.640 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:26.640 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:26.641 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:26.641 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:26.641 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:26.641 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:26.641 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.641 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.905 11:48:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:26.905 "name": "Existed_Raid", 00:12:26.905 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:26.905 "strip_size_kb": 64, 00:12:26.905 "state": "online", 00:12:26.905 "raid_level": "raid0", 00:12:26.905 "superblock": true, 00:12:26.905 "num_base_bdevs": 3, 00:12:26.905 "num_base_bdevs_discovered": 3, 00:12:26.906 "num_base_bdevs_operational": 3, 00:12:26.906 "base_bdevs_list": [ 00:12:26.906 { 00:12:26.906 "name": "NewBaseBdev", 00:12:26.906 "uuid": "1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:26.906 "is_configured": true, 00:12:26.906 "data_offset": 2048, 00:12:26.906 "data_size": 63488 00:12:26.906 }, 00:12:26.906 { 00:12:26.906 "name": "BaseBdev2", 00:12:26.906 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:26.906 "is_configured": true, 00:12:26.906 "data_offset": 2048, 00:12:26.906 "data_size": 63488 00:12:26.906 }, 00:12:26.906 { 00:12:26.906 "name": "BaseBdev3", 00:12:26.906 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:26.906 "is_configured": true, 00:12:26.906 "data_offset": 2048, 00:12:26.906 "data_size": 63488 00:12:26.906 } 00:12:26.906 ] 00:12:26.906 }' 00:12:26.906 11:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:26.906 11:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:27.477 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:12:27.477 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:27.477 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:27.477 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:27.477 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 
00:12:27.477 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:12:27.477 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:27.477 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:27.735 [2024-05-14 11:48:54.761047] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:27.735 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:27.735 "name": "Existed_Raid", 00:12:27.735 "aliases": [ 00:12:27.735 "edf1b99b-afc5-4bf1-923b-7cd80cda9dee" 00:12:27.735 ], 00:12:27.735 "product_name": "Raid Volume", 00:12:27.735 "block_size": 512, 00:12:27.735 "num_blocks": 190464, 00:12:27.735 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:27.735 "assigned_rate_limits": { 00:12:27.735 "rw_ios_per_sec": 0, 00:12:27.735 "rw_mbytes_per_sec": 0, 00:12:27.735 "r_mbytes_per_sec": 0, 00:12:27.735 "w_mbytes_per_sec": 0 00:12:27.735 }, 00:12:27.735 "claimed": false, 00:12:27.735 "zoned": false, 00:12:27.735 "supported_io_types": { 00:12:27.735 "read": true, 00:12:27.735 "write": true, 00:12:27.735 "unmap": true, 00:12:27.735 "write_zeroes": true, 00:12:27.735 "flush": true, 00:12:27.735 "reset": true, 00:12:27.735 "compare": false, 00:12:27.735 "compare_and_write": false, 00:12:27.735 "abort": false, 00:12:27.735 "nvme_admin": false, 00:12:27.735 "nvme_io": false 00:12:27.735 }, 00:12:27.735 "memory_domains": [ 00:12:27.735 { 00:12:27.735 "dma_device_id": "system", 00:12:27.735 "dma_device_type": 1 00:12:27.735 }, 00:12:27.735 { 00:12:27.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.735 "dma_device_type": 2 00:12:27.735 }, 00:12:27.735 { 00:12:27.735 "dma_device_id": "system", 00:12:27.735 "dma_device_type": 1 00:12:27.735 }, 00:12:27.735 { 00:12:27.735 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:27.735 "dma_device_type": 2 00:12:27.735 }, 00:12:27.735 { 00:12:27.735 "dma_device_id": "system", 00:12:27.735 "dma_device_type": 1 00:12:27.735 }, 00:12:27.735 { 00:12:27.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.735 "dma_device_type": 2 00:12:27.735 } 00:12:27.735 ], 00:12:27.735 "driver_specific": { 00:12:27.735 "raid": { 00:12:27.735 "uuid": "edf1b99b-afc5-4bf1-923b-7cd80cda9dee", 00:12:27.735 "strip_size_kb": 64, 00:12:27.735 "state": "online", 00:12:27.735 "raid_level": "raid0", 00:12:27.735 "superblock": true, 00:12:27.735 "num_base_bdevs": 3, 00:12:27.735 "num_base_bdevs_discovered": 3, 00:12:27.735 "num_base_bdevs_operational": 3, 00:12:27.735 "base_bdevs_list": [ 00:12:27.735 { 00:12:27.735 "name": "NewBaseBdev", 00:12:27.735 "uuid": "1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:27.735 "is_configured": true, 00:12:27.735 "data_offset": 2048, 00:12:27.735 "data_size": 63488 00:12:27.735 }, 00:12:27.735 { 00:12:27.735 "name": "BaseBdev2", 00:12:27.735 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:27.735 "is_configured": true, 00:12:27.735 "data_offset": 2048, 00:12:27.735 "data_size": 63488 00:12:27.735 }, 00:12:27.735 { 00:12:27.735 "name": "BaseBdev3", 00:12:27.735 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:27.735 "is_configured": true, 00:12:27.735 "data_offset": 2048, 00:12:27.735 "data_size": 63488 00:12:27.735 } 00:12:27.735 ] 00:12:27.735 } 00:12:27.735 } 00:12:27.735 }' 00:12:27.735 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:27.994 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:12:27.994 BaseBdev2 00:12:27.994 BaseBdev3' 00:12:27.994 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:27.994 11:48:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:27.994 11:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:27.994 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:27.995 "name": "NewBaseBdev", 00:12:27.995 "aliases": [ 00:12:27.995 "1b012eea-0225-4a67-8cb6-b402fcd2169b" 00:12:27.995 ], 00:12:27.995 "product_name": "Malloc disk", 00:12:27.995 "block_size": 512, 00:12:27.995 "num_blocks": 65536, 00:12:27.995 "uuid": "1b012eea-0225-4a67-8cb6-b402fcd2169b", 00:12:27.995 "assigned_rate_limits": { 00:12:27.995 "rw_ios_per_sec": 0, 00:12:27.995 "rw_mbytes_per_sec": 0, 00:12:27.995 "r_mbytes_per_sec": 0, 00:12:27.995 "w_mbytes_per_sec": 0 00:12:27.995 }, 00:12:27.995 "claimed": true, 00:12:27.995 "claim_type": "exclusive_write", 00:12:27.995 "zoned": false, 00:12:27.995 "supported_io_types": { 00:12:27.995 "read": true, 00:12:27.995 "write": true, 00:12:27.995 "unmap": true, 00:12:27.995 "write_zeroes": true, 00:12:27.995 "flush": true, 00:12:27.995 "reset": true, 00:12:27.995 "compare": false, 00:12:27.995 "compare_and_write": false, 00:12:27.995 "abort": true, 00:12:27.995 "nvme_admin": false, 00:12:27.995 "nvme_io": false 00:12:27.995 }, 00:12:27.995 "memory_domains": [ 00:12:27.995 { 00:12:27.995 "dma_device_id": "system", 00:12:27.995 "dma_device_type": 1 00:12:27.995 }, 00:12:27.995 { 00:12:27.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.995 "dma_device_type": 2 00:12:27.995 } 00:12:27.995 ], 00:12:27.995 "driver_specific": {} 00:12:27.995 }' 00:12:27.995 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:28.252 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:28.252 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 
00:12:28.252 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:28.252 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:28.252 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:28.252 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:28.252 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:28.252 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:28.252 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:28.510 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:28.510 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:28.510 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:28.510 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:28.510 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:28.768 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:28.768 "name": "BaseBdev2", 00:12:28.768 "aliases": [ 00:12:28.768 "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61" 00:12:28.768 ], 00:12:28.768 "product_name": "Malloc disk", 00:12:28.768 "block_size": 512, 00:12:28.768 "num_blocks": 65536, 00:12:28.768 "uuid": "ca94b490-7bcf-4644-a5a5-e8ac9cf64b61", 00:12:28.768 "assigned_rate_limits": { 00:12:28.768 "rw_ios_per_sec": 0, 00:12:28.768 "rw_mbytes_per_sec": 0, 00:12:28.768 "r_mbytes_per_sec": 0, 00:12:28.768 "w_mbytes_per_sec": 0 00:12:28.768 }, 00:12:28.768 
"claimed": true, 00:12:28.768 "claim_type": "exclusive_write", 00:12:28.768 "zoned": false, 00:12:28.768 "supported_io_types": { 00:12:28.768 "read": true, 00:12:28.768 "write": true, 00:12:28.768 "unmap": true, 00:12:28.768 "write_zeroes": true, 00:12:28.768 "flush": true, 00:12:28.768 "reset": true, 00:12:28.768 "compare": false, 00:12:28.768 "compare_and_write": false, 00:12:28.768 "abort": true, 00:12:28.768 "nvme_admin": false, 00:12:28.768 "nvme_io": false 00:12:28.768 }, 00:12:28.768 "memory_domains": [ 00:12:28.768 { 00:12:28.768 "dma_device_id": "system", 00:12:28.768 "dma_device_type": 1 00:12:28.768 }, 00:12:28.768 { 00:12:28.768 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.768 "dma_device_type": 2 00:12:28.768 } 00:12:28.768 ], 00:12:28.768 "driver_specific": {} 00:12:28.768 }' 00:12:28.768 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:28.768 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:28.768 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:28.768 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:28.768 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:28.768 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:28.768 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:29.026 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:29.026 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:29.026 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:29.026 11:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:29.026 11:48:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:29.026 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:29.026 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:29.026 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:29.284 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:29.284 "name": "BaseBdev3", 00:12:29.284 "aliases": [ 00:12:29.284 "31ecaa00-9ad4-4164-8263-664ca487087a" 00:12:29.284 ], 00:12:29.284 "product_name": "Malloc disk", 00:12:29.284 "block_size": 512, 00:12:29.284 "num_blocks": 65536, 00:12:29.284 "uuid": "31ecaa00-9ad4-4164-8263-664ca487087a", 00:12:29.284 "assigned_rate_limits": { 00:12:29.284 "rw_ios_per_sec": 0, 00:12:29.284 "rw_mbytes_per_sec": 0, 00:12:29.284 "r_mbytes_per_sec": 0, 00:12:29.284 "w_mbytes_per_sec": 0 00:12:29.284 }, 00:12:29.284 "claimed": true, 00:12:29.284 "claim_type": "exclusive_write", 00:12:29.284 "zoned": false, 00:12:29.284 "supported_io_types": { 00:12:29.284 "read": true, 00:12:29.284 "write": true, 00:12:29.284 "unmap": true, 00:12:29.284 "write_zeroes": true, 00:12:29.284 "flush": true, 00:12:29.284 "reset": true, 00:12:29.284 "compare": false, 00:12:29.284 "compare_and_write": false, 00:12:29.284 "abort": true, 00:12:29.284 "nvme_admin": false, 00:12:29.284 "nvme_io": false 00:12:29.284 }, 00:12:29.284 "memory_domains": [ 00:12:29.284 { 00:12:29.284 "dma_device_id": "system", 00:12:29.284 "dma_device_type": 1 00:12:29.284 }, 00:12:29.284 { 00:12:29.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.284 "dma_device_type": 2 00:12:29.284 } 00:12:29.284 ], 00:12:29.284 "driver_specific": {} 00:12:29.284 }' 00:12:29.284 11:48:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:29.284 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:29.284 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:29.284 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:29.541 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:29.541 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:29.541 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:29.541 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:29.542 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:29.542 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:29.542 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:29.542 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:29.542 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:29.799 [2024-05-14 11:48:56.806218] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:29.799 [2024-05-14 11:48:56.806248] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:29.799 [2024-05-14 11:48:56.806309] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:29.800 [2024-05-14 11:48:56.806363] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:29.800 [2024-05-14 11:48:56.806375] bdev_raid.c: 
350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cdc230 name Existed_Raid, state offline 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 1683250 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1683250 ']' 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 1683250 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1683250 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1683250' 00:12:29.800 killing process with pid 1683250 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 1683250 00:12:29.800 [2024-05-14 11:48:56.873988] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:29.800 11:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 1683250 00:12:30.058 [2024-05-14 11:48:56.904905] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:30.058 11:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:12:30.058 00:12:30.058 real 0m27.630s 00:12:30.058 user 0m50.641s 00:12:30.058 sys 0m4.967s 00:12:30.058 11:48:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:12:30.058 11:48:57 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:30.058 ************************************ 00:12:30.058 END TEST raid_state_function_test_sb 00:12:30.058 ************************************ 00:12:30.316 11:48:57 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:12:30.316 11:48:57 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:12:30.316 11:48:57 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:12:30.316 11:48:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:30.316 ************************************ 00:12:30.316 START TEST raid_superblock_test 00:12:30.316 ************************************ 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid0 3 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid0 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=3 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local strip_size_create_arg 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid0 '!=' raid1 ']' 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=1687421 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 1687421 /var/tmp/spdk-raid.sock 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 1687421 ']' 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:30.316 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:12:30.316 11:48:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:30.316 [2024-05-14 11:48:57.285088] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:12:30.316 [2024-05-14 11:48:57.285153] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1687421 ] 00:12:30.575 [2024-05-14 11:48:57.413187] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:30.575 [2024-05-14 11:48:57.519987] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:30.575 [2024-05-14 11:48:57.588382] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:30.575 [2024-05-14 11:48:57.588436] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:31.140 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:12:31.397 malloc1 00:12:31.397 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:31.655 [2024-05-14 11:48:58.683894] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:31.655 [2024-05-14 11:48:58.683942] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:31.655 [2024-05-14 11:48:58.683966] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26fb2a0 00:12:31.655 [2024-05-14 11:48:58.683978] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:31.655 [2024-05-14 11:48:58.685763] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:31.655 [2024-05-14 11:48:58.685791] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:31.655 pt1 00:12:31.655 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:12:31.655 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:12:31.655 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:12:31.655 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:12:31.655 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:31.655 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:31.655 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:12:31.655 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:31.655 11:48:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:31.914 malloc2 00:12:31.914 11:48:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:32.173 [2024-05-14 11:48:59.179334] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:32.173 [2024-05-14 11:48:59.179381] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:32.173 [2024-05-14 11:48:59.179419] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28ae480 00:12:32.173 [2024-05-14 11:48:59.179434] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:32.173 [2024-05-14 11:48:59.180987] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:32.173 [2024-05-14 11:48:59.181014] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:32.173 pt2 00:12:32.173 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:12:32.173 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:12:32.173 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:12:32.173 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:12:32.173 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:12:32.173 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:32.173 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:12:32.173 11:48:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:32.173 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:12:32.430 malloc3 00:12:32.430 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:32.687 [2024-05-14 11:48:59.657416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:32.687 [2024-05-14 11:48:59.657473] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:32.687 [2024-05-14 11:48:59.657494] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f4e80 00:12:32.687 [2024-05-14 11:48:59.657507] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:32.687 [2024-05-14 11:48:59.659107] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:32.687 [2024-05-14 11:48:59.659135] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:32.687 pt3 00:12:32.687 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:12:32.687 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:12:32.687 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:12:32.945 [2024-05-14 11:48:59.898068] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:32.945 [2024-05-14 11:48:59.899455] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:32.945 [2024-05-14 
11:48:59.899511] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:32.945 [2024-05-14 11:48:59.899671] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x26f6dc0 00:12:32.945 [2024-05-14 11:48:59.899683] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:32.945 [2024-05-14 11:48:59.899886] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26fb900 00:12:32.945 [2024-05-14 11:48:59.900034] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26f6dc0 00:12:32.945 [2024-05-14 11:48:59.900045] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26f6dc0 00:12:32.945 [2024-05-14 11:48:59.900150] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:32.945 11:48:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.945 11:48:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:33.203 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:33.203 "name": "raid_bdev1", 00:12:33.203 "uuid": "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7", 00:12:33.203 "strip_size_kb": 64, 00:12:33.203 "state": "online", 00:12:33.203 "raid_level": "raid0", 00:12:33.203 "superblock": true, 00:12:33.203 "num_base_bdevs": 3, 00:12:33.203 "num_base_bdevs_discovered": 3, 00:12:33.203 "num_base_bdevs_operational": 3, 00:12:33.203 "base_bdevs_list": [ 00:12:33.203 { 00:12:33.203 "name": "pt1", 00:12:33.203 "uuid": "090f9048-e7f1-5b0c-bd3e-10eae152fd3c", 00:12:33.203 "is_configured": true, 00:12:33.203 "data_offset": 2048, 00:12:33.203 "data_size": 63488 00:12:33.203 }, 00:12:33.203 { 00:12:33.203 "name": "pt2", 00:12:33.203 "uuid": "afc41eb1-aa29-5491-b889-9145fb1348cb", 00:12:33.203 "is_configured": true, 00:12:33.203 "data_offset": 2048, 00:12:33.203 "data_size": 63488 00:12:33.203 }, 00:12:33.203 { 00:12:33.203 "name": "pt3", 00:12:33.203 "uuid": "bc776ed5-55a2-53ce-b8eb-5373e03658c8", 00:12:33.203 "is_configured": true, 00:12:33.203 "data_offset": 2048, 00:12:33.203 "data_size": 63488 00:12:33.203 } 00:12:33.203 ] 00:12:33.203 }' 00:12:33.203 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:33.203 11:49:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.769 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:12:33.769 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:12:33.769 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 
-- # local raid_bdev_info 00:12:33.769 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:33.769 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:33.769 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:12:33.769 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:33.769 11:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:34.027 [2024-05-14 11:49:00.985159] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:34.027 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:34.027 "name": "raid_bdev1", 00:12:34.027 "aliases": [ 00:12:34.027 "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7" 00:12:34.027 ], 00:12:34.027 "product_name": "Raid Volume", 00:12:34.027 "block_size": 512, 00:12:34.027 "num_blocks": 190464, 00:12:34.027 "uuid": "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7", 00:12:34.027 "assigned_rate_limits": { 00:12:34.027 "rw_ios_per_sec": 0, 00:12:34.027 "rw_mbytes_per_sec": 0, 00:12:34.027 "r_mbytes_per_sec": 0, 00:12:34.027 "w_mbytes_per_sec": 0 00:12:34.027 }, 00:12:34.027 "claimed": false, 00:12:34.027 "zoned": false, 00:12:34.027 "supported_io_types": { 00:12:34.027 "read": true, 00:12:34.027 "write": true, 00:12:34.027 "unmap": true, 00:12:34.027 "write_zeroes": true, 00:12:34.027 "flush": true, 00:12:34.027 "reset": true, 00:12:34.027 "compare": false, 00:12:34.027 "compare_and_write": false, 00:12:34.027 "abort": false, 00:12:34.027 "nvme_admin": false, 00:12:34.027 "nvme_io": false 00:12:34.027 }, 00:12:34.027 "memory_domains": [ 00:12:34.027 { 00:12:34.027 "dma_device_id": "system", 00:12:34.027 "dma_device_type": 1 00:12:34.027 }, 00:12:34.027 { 00:12:34.027 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:34.027 "dma_device_type": 2 00:12:34.027 }, 00:12:34.027 { 00:12:34.027 "dma_device_id": "system", 00:12:34.027 "dma_device_type": 1 00:12:34.027 }, 00:12:34.027 { 00:12:34.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.027 "dma_device_type": 2 00:12:34.027 }, 00:12:34.027 { 00:12:34.027 "dma_device_id": "system", 00:12:34.027 "dma_device_type": 1 00:12:34.027 }, 00:12:34.027 { 00:12:34.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.027 "dma_device_type": 2 00:12:34.027 } 00:12:34.027 ], 00:12:34.027 "driver_specific": { 00:12:34.027 "raid": { 00:12:34.027 "uuid": "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7", 00:12:34.027 "strip_size_kb": 64, 00:12:34.027 "state": "online", 00:12:34.027 "raid_level": "raid0", 00:12:34.027 "superblock": true, 00:12:34.027 "num_base_bdevs": 3, 00:12:34.027 "num_base_bdevs_discovered": 3, 00:12:34.027 "num_base_bdevs_operational": 3, 00:12:34.027 "base_bdevs_list": [ 00:12:34.027 { 00:12:34.027 "name": "pt1", 00:12:34.027 "uuid": "090f9048-e7f1-5b0c-bd3e-10eae152fd3c", 00:12:34.027 "is_configured": true, 00:12:34.027 "data_offset": 2048, 00:12:34.027 "data_size": 63488 00:12:34.027 }, 00:12:34.027 { 00:12:34.027 "name": "pt2", 00:12:34.027 "uuid": "afc41eb1-aa29-5491-b889-9145fb1348cb", 00:12:34.027 "is_configured": true, 00:12:34.027 "data_offset": 2048, 00:12:34.027 "data_size": 63488 00:12:34.027 }, 00:12:34.027 { 00:12:34.027 "name": "pt3", 00:12:34.027 "uuid": "bc776ed5-55a2-53ce-b8eb-5373e03658c8", 00:12:34.027 "is_configured": true, 00:12:34.027 "data_offset": 2048, 00:12:34.027 "data_size": 63488 00:12:34.027 } 00:12:34.027 ] 00:12:34.027 } 00:12:34.027 } 00:12:34.027 }' 00:12:34.027 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:34.027 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:12:34.027 pt2 00:12:34.027 pt3' 00:12:34.027 
11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:34.027 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:34.027 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:34.285 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:34.285 "name": "pt1", 00:12:34.285 "aliases": [ 00:12:34.285 "090f9048-e7f1-5b0c-bd3e-10eae152fd3c" 00:12:34.285 ], 00:12:34.285 "product_name": "passthru", 00:12:34.285 "block_size": 512, 00:12:34.285 "num_blocks": 65536, 00:12:34.285 "uuid": "090f9048-e7f1-5b0c-bd3e-10eae152fd3c", 00:12:34.285 "assigned_rate_limits": { 00:12:34.285 "rw_ios_per_sec": 0, 00:12:34.285 "rw_mbytes_per_sec": 0, 00:12:34.285 "r_mbytes_per_sec": 0, 00:12:34.285 "w_mbytes_per_sec": 0 00:12:34.285 }, 00:12:34.285 "claimed": true, 00:12:34.285 "claim_type": "exclusive_write", 00:12:34.285 "zoned": false, 00:12:34.285 "supported_io_types": { 00:12:34.285 "read": true, 00:12:34.285 "write": true, 00:12:34.285 "unmap": true, 00:12:34.285 "write_zeroes": true, 00:12:34.285 "flush": true, 00:12:34.285 "reset": true, 00:12:34.285 "compare": false, 00:12:34.285 "compare_and_write": false, 00:12:34.285 "abort": true, 00:12:34.285 "nvme_admin": false, 00:12:34.285 "nvme_io": false 00:12:34.285 }, 00:12:34.285 "memory_domains": [ 00:12:34.285 { 00:12:34.285 "dma_device_id": "system", 00:12:34.285 "dma_device_type": 1 00:12:34.285 }, 00:12:34.285 { 00:12:34.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.285 "dma_device_type": 2 00:12:34.285 } 00:12:34.285 ], 00:12:34.285 "driver_specific": { 00:12:34.285 "passthru": { 00:12:34.285 "name": "pt1", 00:12:34.285 "base_bdev_name": "malloc1" 00:12:34.285 } 00:12:34.286 } 00:12:34.286 }' 00:12:34.286 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- 
# jq .block_size 00:12:34.286 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:34.543 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:34.543 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:34.543 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:34.543 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:34.543 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:34.543 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:34.543 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:34.543 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:34.543 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:34.801 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:34.801 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:34.801 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:34.801 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:35.059 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:35.059 "name": "pt2", 00:12:35.059 "aliases": [ 00:12:35.059 "afc41eb1-aa29-5491-b889-9145fb1348cb" 00:12:35.059 ], 00:12:35.059 "product_name": "passthru", 00:12:35.059 "block_size": 512, 00:12:35.059 "num_blocks": 65536, 00:12:35.059 "uuid": "afc41eb1-aa29-5491-b889-9145fb1348cb", 00:12:35.059 "assigned_rate_limits": { 00:12:35.059 "rw_ios_per_sec": 0, 00:12:35.059 
"rw_mbytes_per_sec": 0, 00:12:35.059 "r_mbytes_per_sec": 0, 00:12:35.059 "w_mbytes_per_sec": 0 00:12:35.059 }, 00:12:35.059 "claimed": true, 00:12:35.059 "claim_type": "exclusive_write", 00:12:35.059 "zoned": false, 00:12:35.059 "supported_io_types": { 00:12:35.059 "read": true, 00:12:35.059 "write": true, 00:12:35.059 "unmap": true, 00:12:35.059 "write_zeroes": true, 00:12:35.059 "flush": true, 00:12:35.059 "reset": true, 00:12:35.059 "compare": false, 00:12:35.059 "compare_and_write": false, 00:12:35.059 "abort": true, 00:12:35.059 "nvme_admin": false, 00:12:35.059 "nvme_io": false 00:12:35.059 }, 00:12:35.059 "memory_domains": [ 00:12:35.059 { 00:12:35.059 "dma_device_id": "system", 00:12:35.059 "dma_device_type": 1 00:12:35.059 }, 00:12:35.059 { 00:12:35.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.059 "dma_device_type": 2 00:12:35.059 } 00:12:35.059 ], 00:12:35.059 "driver_specific": { 00:12:35.059 "passthru": { 00:12:35.059 "name": "pt2", 00:12:35.059 "base_bdev_name": "malloc2" 00:12:35.059 } 00:12:35.059 } 00:12:35.059 }' 00:12:35.059 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:35.059 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:35.059 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:35.059 11:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:35.059 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:35.059 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.059 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:35.059 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:35.317 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.317 11:49:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:35.317 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:35.317 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:35.317 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:35.317 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:35.317 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:35.575 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:35.575 "name": "pt3", 00:12:35.575 "aliases": [ 00:12:35.575 "bc776ed5-55a2-53ce-b8eb-5373e03658c8" 00:12:35.575 ], 00:12:35.575 "product_name": "passthru", 00:12:35.575 "block_size": 512, 00:12:35.575 "num_blocks": 65536, 00:12:35.575 "uuid": "bc776ed5-55a2-53ce-b8eb-5373e03658c8", 00:12:35.575 "assigned_rate_limits": { 00:12:35.575 "rw_ios_per_sec": 0, 00:12:35.575 "rw_mbytes_per_sec": 0, 00:12:35.575 "r_mbytes_per_sec": 0, 00:12:35.575 "w_mbytes_per_sec": 0 00:12:35.575 }, 00:12:35.575 "claimed": true, 00:12:35.575 "claim_type": "exclusive_write", 00:12:35.575 "zoned": false, 00:12:35.575 "supported_io_types": { 00:12:35.575 "read": true, 00:12:35.575 "write": true, 00:12:35.575 "unmap": true, 00:12:35.575 "write_zeroes": true, 00:12:35.575 "flush": true, 00:12:35.575 "reset": true, 00:12:35.575 "compare": false, 00:12:35.575 "compare_and_write": false, 00:12:35.575 "abort": true, 00:12:35.575 "nvme_admin": false, 00:12:35.575 "nvme_io": false 00:12:35.575 }, 00:12:35.575 "memory_domains": [ 00:12:35.575 { 00:12:35.575 "dma_device_id": "system", 00:12:35.575 "dma_device_type": 1 00:12:35.575 }, 00:12:35.575 { 00:12:35.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.575 "dma_device_type": 2 
00:12:35.575 } 00:12:35.575 ], 00:12:35.575 "driver_specific": { 00:12:35.575 "passthru": { 00:12:35.575 "name": "pt3", 00:12:35.575 "base_bdev_name": "malloc3" 00:12:35.575 } 00:12:35.575 } 00:12:35.575 }' 00:12:35.575 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:35.575 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:35.575 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:35.575 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:35.575 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:35.575 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.575 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:35.866 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:35.866 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.866 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:35.866 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:35.866 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:35.866 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:35.866 11:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:12:36.125 [2024-05-14 11:49:03.042831] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:36.125 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7 00:12:36.125 11:49:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7 ']' 00:12:36.125 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:36.383 [2024-05-14 11:49:03.283214] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:36.383 [2024-05-14 11:49:03.283237] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:36.383 [2024-05-14 11:49:03.283292] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:36.383 [2024-05-14 11:49:03.283351] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:36.383 [2024-05-14 11:49:03.283364] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26f6dc0 name raid_bdev1, state offline 00:12:36.383 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.383 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:12:36.641 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:12:36.641 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:12:36.641 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:12:36.641 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:36.900 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:12:36.900 11:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:37.159 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:12:37.160 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:12:37.418 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:37.418 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:37.676 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:37.677 [2024-05-14 11:49:04.734990] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:37.677 [2024-05-14 11:49:04.736384] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:37.677 [2024-05-14 11:49:04.736434] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:12:37.677 [2024-05-14 11:49:04.736480] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:37.677 [2024-05-14 11:49:04.736520] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:37.677 [2024-05-14 11:49:04.736543] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:12:37.677 [2024-05-14 11:49:04.736567] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:37.677 [2024-05-14 11:49:04.736578] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x26f5ff0 name raid_bdev1, state configuring 00:12:37.677 request: 00:12:37.677 { 00:12:37.677 "name": "raid_bdev1", 00:12:37.677 "raid_level": "raid0", 00:12:37.677 "base_bdevs": [ 00:12:37.677 "malloc1", 00:12:37.677 "malloc2", 00:12:37.677 "malloc3" 00:12:37.677 ], 00:12:37.677 "superblock": false, 00:12:37.677 "strip_size_kb": 64, 00:12:37.677 "method": "bdev_raid_create", 00:12:37.677 "req_id": 1 00:12:37.677 } 00:12:37.677 Got JSON-RPC error response 00:12:37.677 response: 00:12:37.677 { 00:12:37.677 "code": -17, 00:12:37.677 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:37.677 } 00:12:37.677 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:37.677 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:37.677 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:37.677 11:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:37.677 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.677 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:12:37.936 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:12:37.936 11:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:12:37.936 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:38.196 [2024-05-14 11:49:05.220206] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:38.196 [2024-05-14 11:49:05.220258] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:12:38.196 [2024-05-14 11:49:05.220281] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28a4060 00:12:38.196 [2024-05-14 11:49:05.220294] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:38.196 [2024-05-14 11:49:05.221901] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:38.196 [2024-05-14 11:49:05.221931] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:38.196 [2024-05-14 11:49:05.222008] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:12:38.196 [2024-05-14 11:49:05.222035] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:38.196 pt1 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.196 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:38.454 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:38.454 "name": "raid_bdev1", 00:12:38.454 "uuid": "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7", 00:12:38.454 "strip_size_kb": 64, 00:12:38.454 "state": "configuring", 00:12:38.454 "raid_level": "raid0", 00:12:38.454 "superblock": true, 00:12:38.454 "num_base_bdevs": 3, 00:12:38.454 "num_base_bdevs_discovered": 1, 00:12:38.454 "num_base_bdevs_operational": 3, 00:12:38.454 "base_bdevs_list": [ 00:12:38.454 { 00:12:38.454 "name": "pt1", 00:12:38.454 "uuid": "090f9048-e7f1-5b0c-bd3e-10eae152fd3c", 00:12:38.454 "is_configured": true, 00:12:38.454 "data_offset": 2048, 00:12:38.454 "data_size": 63488 00:12:38.454 }, 00:12:38.454 { 00:12:38.454 "name": null, 00:12:38.454 "uuid": "afc41eb1-aa29-5491-b889-9145fb1348cb", 00:12:38.454 "is_configured": false, 00:12:38.454 "data_offset": 2048, 00:12:38.454 "data_size": 63488 00:12:38.454 }, 00:12:38.454 { 00:12:38.454 "name": null, 00:12:38.454 "uuid": "bc776ed5-55a2-53ce-b8eb-5373e03658c8", 00:12:38.454 "is_configured": false, 00:12:38.454 "data_offset": 2048, 00:12:38.454 "data_size": 63488 00:12:38.454 } 00:12:38.454 ] 00:12:38.454 }' 00:12:38.454 11:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:38.454 11:49:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.022 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 3 -gt 2 ']' 00:12:39.022 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:39.280 
[2024-05-14 11:49:06.271001] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:39.280 [2024-05-14 11:49:06.271056] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:39.280 [2024-05-14 11:49:06.271076] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26fc0f0 00:12:39.280 [2024-05-14 11:49:06.271088] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:39.280 [2024-05-14 11:49:06.271435] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:39.280 [2024-05-14 11:49:06.271453] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:39.280 [2024-05-14 11:49:06.271523] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:12:39.280 [2024-05-14 11:49:06.271544] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:39.280 pt2 00:12:39.280 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:39.539 [2024-05-14 11:49:06.519677] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:39.539 11:49:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.539 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:39.799 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:39.799 "name": "raid_bdev1", 00:12:39.799 "uuid": "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7", 00:12:39.799 "strip_size_kb": 64, 00:12:39.799 "state": "configuring", 00:12:39.799 "raid_level": "raid0", 00:12:39.799 "superblock": true, 00:12:39.799 "num_base_bdevs": 3, 00:12:39.799 "num_base_bdevs_discovered": 1, 00:12:39.799 "num_base_bdevs_operational": 3, 00:12:39.799 "base_bdevs_list": [ 00:12:39.799 { 00:12:39.799 "name": "pt1", 00:12:39.799 "uuid": "090f9048-e7f1-5b0c-bd3e-10eae152fd3c", 00:12:39.799 "is_configured": true, 00:12:39.799 "data_offset": 2048, 00:12:39.799 "data_size": 63488 00:12:39.799 }, 00:12:39.799 { 00:12:39.799 "name": null, 00:12:39.799 "uuid": "afc41eb1-aa29-5491-b889-9145fb1348cb", 00:12:39.799 "is_configured": false, 00:12:39.799 "data_offset": 2048, 00:12:39.799 "data_size": 63488 00:12:39.799 }, 00:12:39.799 { 00:12:39.799 "name": null, 00:12:39.799 "uuid": "bc776ed5-55a2-53ce-b8eb-5373e03658c8", 00:12:39.799 "is_configured": false, 00:12:39.799 "data_offset": 2048, 00:12:39.799 "data_size": 63488 00:12:39.799 } 00:12:39.799 ] 00:12:39.799 }' 00:12:39.799 11:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # 
xtrace_disable 00:12:39.799 11:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.367 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:12:40.367 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:12:40.367 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:40.626 [2024-05-14 11:49:07.518317] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:40.626 [2024-05-14 11:49:07.518367] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:40.626 [2024-05-14 11:49:07.518386] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f3530 00:12:40.626 [2024-05-14 11:49:07.518405] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:40.626 [2024-05-14 11:49:07.518747] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:40.626 [2024-05-14 11:49:07.518764] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:40.626 [2024-05-14 11:49:07.518826] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:12:40.626 [2024-05-14 11:49:07.518846] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:40.626 pt2 00:12:40.626 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:12:40.626 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:12:40.626 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 
00000000-0000-0000-0000-000000000003 00:12:40.885 [2024-05-14 11:49:07.762956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:40.885 [2024-05-14 11:49:07.762991] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:40.885 [2024-05-14 11:49:07.763011] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f38c0 00:12:40.885 [2024-05-14 11:49:07.763023] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:40.885 [2024-05-14 11:49:07.763312] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:40.885 [2024-05-14 11:49:07.763329] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:40.885 [2024-05-14 11:49:07.763381] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:12:40.885 [2024-05-14 11:49:07.763407] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:40.885 [2024-05-14 11:49:07.763509] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x26f77b0 00:12:40.885 [2024-05-14 11:49:07.763519] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:40.885 [2024-05-14 11:49:07.763684] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26fc8d0 00:12:40.885 [2024-05-14 11:49:07.763808] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26f77b0 00:12:40.885 [2024-05-14 11:49:07.763818] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26f77b0 00:12:40.885 [2024-05-14 11:49:07.763912] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:40.885 pt3 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:12:40.885 
11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.885 11:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:41.143 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:41.143 "name": "raid_bdev1", 00:12:41.143 "uuid": "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7", 00:12:41.143 "strip_size_kb": 64, 00:12:41.143 "state": "online", 00:12:41.143 "raid_level": "raid0", 00:12:41.143 "superblock": true, 00:12:41.143 "num_base_bdevs": 3, 00:12:41.143 "num_base_bdevs_discovered": 3, 00:12:41.144 "num_base_bdevs_operational": 3, 00:12:41.144 "base_bdevs_list": [ 00:12:41.144 { 00:12:41.144 "name": "pt1", 00:12:41.144 "uuid": 
"090f9048-e7f1-5b0c-bd3e-10eae152fd3c", 00:12:41.144 "is_configured": true, 00:12:41.144 "data_offset": 2048, 00:12:41.144 "data_size": 63488 00:12:41.144 }, 00:12:41.144 { 00:12:41.144 "name": "pt2", 00:12:41.144 "uuid": "afc41eb1-aa29-5491-b889-9145fb1348cb", 00:12:41.144 "is_configured": true, 00:12:41.144 "data_offset": 2048, 00:12:41.144 "data_size": 63488 00:12:41.144 }, 00:12:41.144 { 00:12:41.144 "name": "pt3", 00:12:41.144 "uuid": "bc776ed5-55a2-53ce-b8eb-5373e03658c8", 00:12:41.144 "is_configured": true, 00:12:41.144 "data_offset": 2048, 00:12:41.144 "data_size": 63488 00:12:41.144 } 00:12:41.144 ] 00:12:41.144 }' 00:12:41.144 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:41.144 11:49:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.741 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:12:41.741 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:12:41.741 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:41.741 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:41.741 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:41.741 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:12:41.741 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:41.741 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:42.001 [2024-05-14 11:49:08.854115] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:42.001 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:42.001 
"name": "raid_bdev1", 00:12:42.001 "aliases": [ 00:12:42.001 "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7" 00:12:42.001 ], 00:12:42.001 "product_name": "Raid Volume", 00:12:42.001 "block_size": 512, 00:12:42.001 "num_blocks": 190464, 00:12:42.001 "uuid": "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7", 00:12:42.001 "assigned_rate_limits": { 00:12:42.001 "rw_ios_per_sec": 0, 00:12:42.001 "rw_mbytes_per_sec": 0, 00:12:42.001 "r_mbytes_per_sec": 0, 00:12:42.001 "w_mbytes_per_sec": 0 00:12:42.001 }, 00:12:42.001 "claimed": false, 00:12:42.001 "zoned": false, 00:12:42.001 "supported_io_types": { 00:12:42.001 "read": true, 00:12:42.001 "write": true, 00:12:42.001 "unmap": true, 00:12:42.001 "write_zeroes": true, 00:12:42.001 "flush": true, 00:12:42.001 "reset": true, 00:12:42.001 "compare": false, 00:12:42.001 "compare_and_write": false, 00:12:42.001 "abort": false, 00:12:42.001 "nvme_admin": false, 00:12:42.001 "nvme_io": false 00:12:42.001 }, 00:12:42.001 "memory_domains": [ 00:12:42.001 { 00:12:42.001 "dma_device_id": "system", 00:12:42.001 "dma_device_type": 1 00:12:42.001 }, 00:12:42.001 { 00:12:42.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.001 "dma_device_type": 2 00:12:42.001 }, 00:12:42.001 { 00:12:42.001 "dma_device_id": "system", 00:12:42.001 "dma_device_type": 1 00:12:42.001 }, 00:12:42.001 { 00:12:42.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.001 "dma_device_type": 2 00:12:42.001 }, 00:12:42.001 { 00:12:42.001 "dma_device_id": "system", 00:12:42.001 "dma_device_type": 1 00:12:42.001 }, 00:12:42.001 { 00:12:42.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.001 "dma_device_type": 2 00:12:42.001 } 00:12:42.001 ], 00:12:42.001 "driver_specific": { 00:12:42.001 "raid": { 00:12:42.001 "uuid": "e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7", 00:12:42.001 "strip_size_kb": 64, 00:12:42.001 "state": "online", 00:12:42.001 "raid_level": "raid0", 00:12:42.001 "superblock": true, 00:12:42.001 "num_base_bdevs": 3, 00:12:42.001 "num_base_bdevs_discovered": 3, 
00:12:42.001 "num_base_bdevs_operational": 3, 00:12:42.001 "base_bdevs_list": [ 00:12:42.001 { 00:12:42.001 "name": "pt1", 00:12:42.001 "uuid": "090f9048-e7f1-5b0c-bd3e-10eae152fd3c", 00:12:42.001 "is_configured": true, 00:12:42.001 "data_offset": 2048, 00:12:42.001 "data_size": 63488 00:12:42.001 }, 00:12:42.001 { 00:12:42.001 "name": "pt2", 00:12:42.001 "uuid": "afc41eb1-aa29-5491-b889-9145fb1348cb", 00:12:42.001 "is_configured": true, 00:12:42.001 "data_offset": 2048, 00:12:42.001 "data_size": 63488 00:12:42.001 }, 00:12:42.001 { 00:12:42.001 "name": "pt3", 00:12:42.001 "uuid": "bc776ed5-55a2-53ce-b8eb-5373e03658c8", 00:12:42.001 "is_configured": true, 00:12:42.001 "data_offset": 2048, 00:12:42.001 "data_size": 63488 00:12:42.001 } 00:12:42.001 ] 00:12:42.001 } 00:12:42.001 } 00:12:42.001 }' 00:12:42.001 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:42.001 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:12:42.001 pt2 00:12:42.001 pt3' 00:12:42.001 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:42.001 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:42.001 11:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:42.260 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:42.260 "name": "pt1", 00:12:42.260 "aliases": [ 00:12:42.260 "090f9048-e7f1-5b0c-bd3e-10eae152fd3c" 00:12:42.260 ], 00:12:42.260 "product_name": "passthru", 00:12:42.260 "block_size": 512, 00:12:42.260 "num_blocks": 65536, 00:12:42.260 "uuid": "090f9048-e7f1-5b0c-bd3e-10eae152fd3c", 00:12:42.260 "assigned_rate_limits": { 00:12:42.260 "rw_ios_per_sec": 0, 00:12:42.260 
"rw_mbytes_per_sec": 0, 00:12:42.260 "r_mbytes_per_sec": 0, 00:12:42.260 "w_mbytes_per_sec": 0 00:12:42.260 }, 00:12:42.260 "claimed": true, 00:12:42.260 "claim_type": "exclusive_write", 00:12:42.260 "zoned": false, 00:12:42.260 "supported_io_types": { 00:12:42.260 "read": true, 00:12:42.260 "write": true, 00:12:42.260 "unmap": true, 00:12:42.260 "write_zeroes": true, 00:12:42.260 "flush": true, 00:12:42.260 "reset": true, 00:12:42.260 "compare": false, 00:12:42.260 "compare_and_write": false, 00:12:42.260 "abort": true, 00:12:42.260 "nvme_admin": false, 00:12:42.260 "nvme_io": false 00:12:42.260 }, 00:12:42.260 "memory_domains": [ 00:12:42.260 { 00:12:42.260 "dma_device_id": "system", 00:12:42.260 "dma_device_type": 1 00:12:42.260 }, 00:12:42.260 { 00:12:42.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.260 "dma_device_type": 2 00:12:42.260 } 00:12:42.260 ], 00:12:42.260 "driver_specific": { 00:12:42.260 "passthru": { 00:12:42.260 "name": "pt1", 00:12:42.260 "base_bdev_name": "malloc1" 00:12:42.260 } 00:12:42.260 } 00:12:42.260 }' 00:12:42.260 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:42.260 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:42.260 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:42.260 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:42.260 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:42.260 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:42.260 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:42.518 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:42.518 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.518 11:49:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:42.518 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:42.518 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:42.518 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:42.518 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:42.518 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:42.777 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:42.777 "name": "pt2", 00:12:42.777 "aliases": [ 00:12:42.777 "afc41eb1-aa29-5491-b889-9145fb1348cb" 00:12:42.777 ], 00:12:42.777 "product_name": "passthru", 00:12:42.777 "block_size": 512, 00:12:42.777 "num_blocks": 65536, 00:12:42.777 "uuid": "afc41eb1-aa29-5491-b889-9145fb1348cb", 00:12:42.777 "assigned_rate_limits": { 00:12:42.777 "rw_ios_per_sec": 0, 00:12:42.777 "rw_mbytes_per_sec": 0, 00:12:42.777 "r_mbytes_per_sec": 0, 00:12:42.777 "w_mbytes_per_sec": 0 00:12:42.777 }, 00:12:42.777 "claimed": true, 00:12:42.777 "claim_type": "exclusive_write", 00:12:42.777 "zoned": false, 00:12:42.777 "supported_io_types": { 00:12:42.777 "read": true, 00:12:42.777 "write": true, 00:12:42.777 "unmap": true, 00:12:42.777 "write_zeroes": true, 00:12:42.777 "flush": true, 00:12:42.777 "reset": true, 00:12:42.777 "compare": false, 00:12:42.777 "compare_and_write": false, 00:12:42.777 "abort": true, 00:12:42.777 "nvme_admin": false, 00:12:42.777 "nvme_io": false 00:12:42.777 }, 00:12:42.777 "memory_domains": [ 00:12:42.777 { 00:12:42.777 "dma_device_id": "system", 00:12:42.777 "dma_device_type": 1 00:12:42.777 }, 00:12:42.777 { 00:12:42.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.777 "dma_device_type": 2 
00:12:42.777 } 00:12:42.777 ], 00:12:42.777 "driver_specific": { 00:12:42.777 "passthru": { 00:12:42.777 "name": "pt2", 00:12:42.777 "base_bdev_name": "malloc2" 00:12:42.777 } 00:12:42.777 } 00:12:42.777 }' 00:12:42.777 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:42.777 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:42.777 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:42.777 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:42.777 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:43.035 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:43.035 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:43.035 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:43.035 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:43.035 11:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:43.035 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:43.035 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:43.035 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:43.035 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:43.035 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:43.295 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:43.295 "name": "pt3", 00:12:43.295 "aliases": [ 00:12:43.295 "bc776ed5-55a2-53ce-b8eb-5373e03658c8" 
00:12:43.295 ], 00:12:43.295 "product_name": "passthru", 00:12:43.295 "block_size": 512, 00:12:43.295 "num_blocks": 65536, 00:12:43.295 "uuid": "bc776ed5-55a2-53ce-b8eb-5373e03658c8", 00:12:43.295 "assigned_rate_limits": { 00:12:43.295 "rw_ios_per_sec": 0, 00:12:43.295 "rw_mbytes_per_sec": 0, 00:12:43.295 "r_mbytes_per_sec": 0, 00:12:43.295 "w_mbytes_per_sec": 0 00:12:43.295 }, 00:12:43.295 "claimed": true, 00:12:43.295 "claim_type": "exclusive_write", 00:12:43.295 "zoned": false, 00:12:43.295 "supported_io_types": { 00:12:43.295 "read": true, 00:12:43.295 "write": true, 00:12:43.295 "unmap": true, 00:12:43.295 "write_zeroes": true, 00:12:43.295 "flush": true, 00:12:43.295 "reset": true, 00:12:43.295 "compare": false, 00:12:43.295 "compare_and_write": false, 00:12:43.295 "abort": true, 00:12:43.295 "nvme_admin": false, 00:12:43.295 "nvme_io": false 00:12:43.295 }, 00:12:43.295 "memory_domains": [ 00:12:43.295 { 00:12:43.295 "dma_device_id": "system", 00:12:43.295 "dma_device_type": 1 00:12:43.295 }, 00:12:43.295 { 00:12:43.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.295 "dma_device_type": 2 00:12:43.295 } 00:12:43.295 ], 00:12:43.295 "driver_specific": { 00:12:43.295 "passthru": { 00:12:43.295 "name": "pt3", 00:12:43.295 "base_bdev_name": "malloc3" 00:12:43.295 } 00:12:43.295 } 00:12:43.295 }' 00:12:43.295 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:43.295 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:43.295 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:43.295 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:43.295 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:43.554 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:43.554 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:12:43.554 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:43.554 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:43.554 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:43.554 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:43.554 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:43.554 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:43.554 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:12:43.813 [2024-05-14 11:49:10.743109] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7 '!=' e821bbdd-94b7-46cd-9a69-c7ad6b6fe6b7 ']' 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid0 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 1687421 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 1687421 ']' 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 1687421 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1687421 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1687421' 00:12:43.813 killing process with pid 1687421 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 1687421 00:12:43.813 [2024-05-14 11:49:10.819320] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:43.813 [2024-05-14 11:49:10.819384] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:43.813 [2024-05-14 11:49:10.819460] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:43.813 [2024-05-14 11:49:10.819475] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26f77b0 name raid_bdev1, state offline 00:12:43.813 11:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 1687421 00:12:43.813 [2024-05-14 11:49:10.848032] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:44.071 11:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:12:44.071 00:12:44.071 real 0m13.846s 00:12:44.071 user 0m24.909s 00:12:44.071 sys 0m2.515s 00:12:44.071 11:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:12:44.071 11:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.071 ************************************ 00:12:44.071 END TEST raid_superblock_test 00:12:44.071 ************************************ 00:12:44.071 11:49:11 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:12:44.071 11:49:11 bdev_raid -- bdev/bdev_raid.sh@815 -- # 
run_test raid_state_function_test raid_state_function_test concat 3 false 00:12:44.071 11:49:11 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:12:44.071 11:49:11 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:12:44.071 11:49:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:44.071 ************************************ 00:12:44.071 START TEST raid_state_function_test 00:12:44.071 ************************************ 00:12:44.071 11:49:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 3 false 00:12:44.071 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:12:44.072 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:12:44.072 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:12:44.072 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:12:44.072 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:12:44.072 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:44.072 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:12:44.072 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:12:44.331 
11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=1689607 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1689607' 00:12:44.331 Process raid pid: 1689607 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # 
waitforlisten 1689607 /var/tmp/spdk-raid.sock 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 1689607 ']' 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:44.331 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:12:44.331 11:49:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.331 [2024-05-14 11:49:11.221236] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:12:44.331 [2024-05-14 11:49:11.221297] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:44.331 [2024-05-14 11:49:11.349319] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:44.589 [2024-05-14 11:49:11.453431] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:44.589 [2024-05-14 11:49:11.517254] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:44.589 [2024-05-14 11:49:11.517291] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:45.155 11:49:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:12:45.155 11:49:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:12:45.155 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:45.413 [2024-05-14 11:49:12.364857] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:45.413 [2024-05-14 11:49:12.364898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:45.413 [2024-05-14 11:49:12.364910] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:45.413 [2024-05-14 11:49:12.364922] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:45.413 [2024-05-14 11:49:12.364931] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:45.413 [2024-05-14 11:49:12.364951] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:45.413 11:49:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.413 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:45.671 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:45.671 "name": "Existed_Raid", 00:12:45.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.671 "strip_size_kb": 64, 00:12:45.671 "state": "configuring", 00:12:45.671 "raid_level": "concat", 00:12:45.671 "superblock": false, 00:12:45.671 "num_base_bdevs": 3, 00:12:45.671 "num_base_bdevs_discovered": 0, 00:12:45.671 "num_base_bdevs_operational": 3, 00:12:45.671 "base_bdevs_list": [ 00:12:45.671 { 
00:12:45.671 "name": "BaseBdev1", 00:12:45.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.671 "is_configured": false, 00:12:45.671 "data_offset": 0, 00:12:45.671 "data_size": 0 00:12:45.671 }, 00:12:45.671 { 00:12:45.671 "name": "BaseBdev2", 00:12:45.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.671 "is_configured": false, 00:12:45.671 "data_offset": 0, 00:12:45.671 "data_size": 0 00:12:45.671 }, 00:12:45.671 { 00:12:45.671 "name": "BaseBdev3", 00:12:45.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.671 "is_configured": false, 00:12:45.671 "data_offset": 0, 00:12:45.671 "data_size": 0 00:12:45.671 } 00:12:45.671 ] 00:12:45.671 }' 00:12:45.671 11:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:45.671 11:49:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.237 11:49:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:46.495 [2024-05-14 11:49:13.439571] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:46.495 [2024-05-14 11:49:13.439600] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19a0700 name Existed_Raid, state configuring 00:12:46.495 11:49:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:46.754 [2024-05-14 11:49:13.684224] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:46.754 [2024-05-14 11:49:13.684251] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:46.754 [2024-05-14 11:49:13.684261] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:12:46.754 [2024-05-14 11:49:13.684273] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:46.754 [2024-05-14 11:49:13.684282] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:46.754 [2024-05-14 11:49:13.684293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:46.754 11:49:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:47.013 [2024-05-14 11:49:13.942661] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:47.013 BaseBdev1 00:12:47.013 11:49:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:12:47.013 11:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:47.013 11:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:47.013 11:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:47.013 11:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:47.013 11:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:47.013 11:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:47.271 11:49:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:47.530 [ 00:12:47.530 { 00:12:47.530 "name": "BaseBdev1", 00:12:47.530 "aliases": [ 00:12:47.530 
"96299a04-cc23-44fb-9338-fe09ea2ae9df" 00:12:47.530 ], 00:12:47.530 "product_name": "Malloc disk", 00:12:47.530 "block_size": 512, 00:12:47.530 "num_blocks": 65536, 00:12:47.530 "uuid": "96299a04-cc23-44fb-9338-fe09ea2ae9df", 00:12:47.530 "assigned_rate_limits": { 00:12:47.530 "rw_ios_per_sec": 0, 00:12:47.530 "rw_mbytes_per_sec": 0, 00:12:47.530 "r_mbytes_per_sec": 0, 00:12:47.530 "w_mbytes_per_sec": 0 00:12:47.530 }, 00:12:47.530 "claimed": true, 00:12:47.530 "claim_type": "exclusive_write", 00:12:47.530 "zoned": false, 00:12:47.530 "supported_io_types": { 00:12:47.530 "read": true, 00:12:47.530 "write": true, 00:12:47.530 "unmap": true, 00:12:47.530 "write_zeroes": true, 00:12:47.530 "flush": true, 00:12:47.530 "reset": true, 00:12:47.530 "compare": false, 00:12:47.530 "compare_and_write": false, 00:12:47.530 "abort": true, 00:12:47.530 "nvme_admin": false, 00:12:47.530 "nvme_io": false 00:12:47.530 }, 00:12:47.530 "memory_domains": [ 00:12:47.530 { 00:12:47.530 "dma_device_id": "system", 00:12:47.530 "dma_device_type": 1 00:12:47.530 }, 00:12:47.530 { 00:12:47.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.530 "dma_device_type": 2 00:12:47.530 } 00:12:47.530 ], 00:12:47.530 "driver_specific": {} 00:12:47.530 } 00:12:47.530 ] 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:47.530 11:49:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.530 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:47.790 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:47.790 "name": "Existed_Raid", 00:12:47.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.790 "strip_size_kb": 64, 00:12:47.790 "state": "configuring", 00:12:47.790 "raid_level": "concat", 00:12:47.790 "superblock": false, 00:12:47.790 "num_base_bdevs": 3, 00:12:47.790 "num_base_bdevs_discovered": 1, 00:12:47.790 "num_base_bdevs_operational": 3, 00:12:47.790 "base_bdevs_list": [ 00:12:47.790 { 00:12:47.790 "name": "BaseBdev1", 00:12:47.790 "uuid": "96299a04-cc23-44fb-9338-fe09ea2ae9df", 00:12:47.790 "is_configured": true, 00:12:47.790 "data_offset": 0, 00:12:47.790 "data_size": 65536 00:12:47.790 }, 00:12:47.790 { 00:12:47.790 "name": "BaseBdev2", 00:12:47.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.790 "is_configured": false, 00:12:47.790 "data_offset": 0, 00:12:47.790 "data_size": 0 00:12:47.790 }, 00:12:47.790 { 00:12:47.790 "name": "BaseBdev3", 00:12:47.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.790 "is_configured": false, 00:12:47.790 "data_offset": 0, 
00:12:47.790 "data_size": 0 00:12:47.790 } 00:12:47.790 ] 00:12:47.790 }' 00:12:47.790 11:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:47.790 11:49:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:48.355 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:48.614 [2024-05-14 11:49:15.454633] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:48.614 [2024-05-14 11:49:15.454670] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x199fff0 name Existed_Raid, state configuring 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:48.614 [2024-05-14 11:49:15.635150] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:48.614 [2024-05-14 11:49:15.636719] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:48.614 [2024-05-14 11:49:15.636754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:48.614 [2024-05-14 11:49:15.636764] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:48.614 [2024-05-14 11:49:15.636776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid 
configuring concat 64 3 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.614 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:48.873 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:48.873 "name": "Existed_Raid", 00:12:48.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.873 "strip_size_kb": 64, 00:12:48.873 "state": "configuring", 00:12:48.873 "raid_level": "concat", 00:12:48.873 "superblock": false, 00:12:48.873 "num_base_bdevs": 3, 00:12:48.873 "num_base_bdevs_discovered": 1, 00:12:48.873 "num_base_bdevs_operational": 3, 00:12:48.873 "base_bdevs_list": [ 00:12:48.873 { 00:12:48.873 "name": "BaseBdev1", 00:12:48.873 "uuid": "96299a04-cc23-44fb-9338-fe09ea2ae9df", 00:12:48.873 
"is_configured": true, 00:12:48.873 "data_offset": 0, 00:12:48.873 "data_size": 65536 00:12:48.873 }, 00:12:48.873 { 00:12:48.873 "name": "BaseBdev2", 00:12:48.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.873 "is_configured": false, 00:12:48.873 "data_offset": 0, 00:12:48.873 "data_size": 0 00:12:48.873 }, 00:12:48.873 { 00:12:48.873 "name": "BaseBdev3", 00:12:48.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.873 "is_configured": false, 00:12:48.873 "data_offset": 0, 00:12:48.873 "data_size": 0 00:12:48.873 } 00:12:48.873 ] 00:12:48.873 }' 00:12:48.873 11:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:48.873 11:49:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.440 11:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:49.698 [2024-05-14 11:49:16.581035] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:49.698 BaseBdev2 00:12:49.698 11:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:12:49.698 11:49:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:49.698 11:49:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:49.698 11:49:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:49.698 11:49:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:49.698 11:49:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:49.699 11:49:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:12:49.957 11:49:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:50.216 [ 00:12:50.216 { 00:12:50.216 "name": "BaseBdev2", 00:12:50.216 "aliases": [ 00:12:50.216 "55534f8b-99cc-4afb-8dd5-0147f0fc5c1a" 00:12:50.216 ], 00:12:50.216 "product_name": "Malloc disk", 00:12:50.216 "block_size": 512, 00:12:50.216 "num_blocks": 65536, 00:12:50.216 "uuid": "55534f8b-99cc-4afb-8dd5-0147f0fc5c1a", 00:12:50.216 "assigned_rate_limits": { 00:12:50.216 "rw_ios_per_sec": 0, 00:12:50.216 "rw_mbytes_per_sec": 0, 00:12:50.216 "r_mbytes_per_sec": 0, 00:12:50.216 "w_mbytes_per_sec": 0 00:12:50.216 }, 00:12:50.216 "claimed": true, 00:12:50.216 "claim_type": "exclusive_write", 00:12:50.216 "zoned": false, 00:12:50.216 "supported_io_types": { 00:12:50.216 "read": true, 00:12:50.216 "write": true, 00:12:50.216 "unmap": true, 00:12:50.216 "write_zeroes": true, 00:12:50.216 "flush": true, 00:12:50.216 "reset": true, 00:12:50.216 "compare": false, 00:12:50.216 "compare_and_write": false, 00:12:50.216 "abort": true, 00:12:50.216 "nvme_admin": false, 00:12:50.216 "nvme_io": false 00:12:50.216 }, 00:12:50.216 "memory_domains": [ 00:12:50.216 { 00:12:50.216 "dma_device_id": "system", 00:12:50.216 "dma_device_type": 1 00:12:50.216 }, 00:12:50.216 { 00:12:50.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.216 "dma_device_type": 2 00:12:50.216 } 00:12:50.216 ], 00:12:50.216 "driver_specific": {} 00:12:50.216 } 00:12:50.216 ] 00:12:50.216 11:49:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:50.216 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 
-- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.217 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:50.476 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:50.476 "name": "Existed_Raid", 00:12:50.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:50.476 "strip_size_kb": 64, 00:12:50.476 "state": "configuring", 00:12:50.476 "raid_level": "concat", 00:12:50.476 "superblock": false, 00:12:50.476 "num_base_bdevs": 3, 00:12:50.476 "num_base_bdevs_discovered": 2, 00:12:50.476 "num_base_bdevs_operational": 3, 00:12:50.476 "base_bdevs_list": [ 00:12:50.476 { 00:12:50.476 "name": "BaseBdev1", 00:12:50.476 "uuid": 
"96299a04-cc23-44fb-9338-fe09ea2ae9df", 00:12:50.476 "is_configured": true, 00:12:50.476 "data_offset": 0, 00:12:50.476 "data_size": 65536 00:12:50.476 }, 00:12:50.476 { 00:12:50.476 "name": "BaseBdev2", 00:12:50.476 "uuid": "55534f8b-99cc-4afb-8dd5-0147f0fc5c1a", 00:12:50.476 "is_configured": true, 00:12:50.476 "data_offset": 0, 00:12:50.476 "data_size": 65536 00:12:50.476 }, 00:12:50.476 { 00:12:50.476 "name": "BaseBdev3", 00:12:50.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:50.476 "is_configured": false, 00:12:50.476 "data_offset": 0, 00:12:50.476 "data_size": 0 00:12:50.476 } 00:12:50.476 ] 00:12:50.476 }' 00:12:50.476 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:50.476 11:49:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:51.043 11:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:51.302 [2024-05-14 11:49:18.168639] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:51.302 [2024-05-14 11:49:18.168676] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x19a1080 00:12:51.302 [2024-05-14 11:49:18.168684] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:51.302 [2024-05-14 11:49:18.168879] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a0d50 00:12:51.302 [2024-05-14 11:49:18.169002] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19a1080 00:12:51.302 [2024-05-14 11:49:18.169012] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19a1080 00:12:51.302 [2024-05-14 11:49:18.169177] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:51.302 BaseBdev3 00:12:51.302 11:49:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:12:51.302 11:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:51.302 11:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:51.302 11:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:51.302 11:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:51.302 11:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:51.302 11:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:51.560 11:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:51.818 [ 00:12:51.818 { 00:12:51.818 "name": "BaseBdev3", 00:12:51.818 "aliases": [ 00:12:51.818 "e1fa2d0e-e219-46a5-ab51-1b44695dcc79" 00:12:51.818 ], 00:12:51.818 "product_name": "Malloc disk", 00:12:51.818 "block_size": 512, 00:12:51.818 "num_blocks": 65536, 00:12:51.818 "uuid": "e1fa2d0e-e219-46a5-ab51-1b44695dcc79", 00:12:51.818 "assigned_rate_limits": { 00:12:51.818 "rw_ios_per_sec": 0, 00:12:51.818 "rw_mbytes_per_sec": 0, 00:12:51.818 "r_mbytes_per_sec": 0, 00:12:51.818 "w_mbytes_per_sec": 0 00:12:51.818 }, 00:12:51.818 "claimed": true, 00:12:51.818 "claim_type": "exclusive_write", 00:12:51.818 "zoned": false, 00:12:51.819 "supported_io_types": { 00:12:51.819 "read": true, 00:12:51.819 "write": true, 00:12:51.819 "unmap": true, 00:12:51.819 "write_zeroes": true, 00:12:51.819 "flush": true, 00:12:51.819 "reset": true, 00:12:51.819 "compare": false, 00:12:51.819 "compare_and_write": false, 
00:12:51.819 "abort": true, 00:12:51.819 "nvme_admin": false, 00:12:51.819 "nvme_io": false 00:12:51.819 }, 00:12:51.819 "memory_domains": [ 00:12:51.819 { 00:12:51.819 "dma_device_id": "system", 00:12:51.819 "dma_device_type": 1 00:12:51.819 }, 00:12:51.819 { 00:12:51.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.819 "dma_device_type": 2 00:12:51.819 } 00:12:51.819 ], 00:12:51.819 "driver_specific": {} 00:12:51.819 } 00:12:51.819 ] 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.819 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:52.076 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:52.076 "name": "Existed_Raid", 00:12:52.076 "uuid": "7776f7aa-1a7d-4fda-8bd5-dbcd2ced82ba", 00:12:52.076 "strip_size_kb": 64, 00:12:52.076 "state": "online", 00:12:52.076 "raid_level": "concat", 00:12:52.076 "superblock": false, 00:12:52.076 "num_base_bdevs": 3, 00:12:52.076 "num_base_bdevs_discovered": 3, 00:12:52.076 "num_base_bdevs_operational": 3, 00:12:52.076 "base_bdevs_list": [ 00:12:52.076 { 00:12:52.076 "name": "BaseBdev1", 00:12:52.076 "uuid": "96299a04-cc23-44fb-9338-fe09ea2ae9df", 00:12:52.076 "is_configured": true, 00:12:52.076 "data_offset": 0, 00:12:52.076 "data_size": 65536 00:12:52.076 }, 00:12:52.076 { 00:12:52.076 "name": "BaseBdev2", 00:12:52.076 "uuid": "55534f8b-99cc-4afb-8dd5-0147f0fc5c1a", 00:12:52.076 "is_configured": true, 00:12:52.076 "data_offset": 0, 00:12:52.076 "data_size": 65536 00:12:52.076 }, 00:12:52.076 { 00:12:52.076 "name": "BaseBdev3", 00:12:52.076 "uuid": "e1fa2d0e-e219-46a5-ab51-1b44695dcc79", 00:12:52.076 "is_configured": true, 00:12:52.076 "data_offset": 0, 00:12:52.076 "data_size": 65536 00:12:52.076 } 00:12:52.076 ] 00:12:52.076 }' 00:12:52.076 11:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:52.076 11:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.642 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:12:52.642 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:52.642 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local 
raid_bdev_info 00:12:52.642 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:52.642 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:52.642 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:12:52.642 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:52.642 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:52.901 [2024-05-14 11:49:19.749099] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:52.901 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:52.901 "name": "Existed_Raid", 00:12:52.901 "aliases": [ 00:12:52.901 "7776f7aa-1a7d-4fda-8bd5-dbcd2ced82ba" 00:12:52.901 ], 00:12:52.901 "product_name": "Raid Volume", 00:12:52.901 "block_size": 512, 00:12:52.901 "num_blocks": 196608, 00:12:52.901 "uuid": "7776f7aa-1a7d-4fda-8bd5-dbcd2ced82ba", 00:12:52.901 "assigned_rate_limits": { 00:12:52.901 "rw_ios_per_sec": 0, 00:12:52.901 "rw_mbytes_per_sec": 0, 00:12:52.901 "r_mbytes_per_sec": 0, 00:12:52.901 "w_mbytes_per_sec": 0 00:12:52.901 }, 00:12:52.901 "claimed": false, 00:12:52.901 "zoned": false, 00:12:52.901 "supported_io_types": { 00:12:52.901 "read": true, 00:12:52.901 "write": true, 00:12:52.901 "unmap": true, 00:12:52.901 "write_zeroes": true, 00:12:52.901 "flush": true, 00:12:52.901 "reset": true, 00:12:52.901 "compare": false, 00:12:52.901 "compare_and_write": false, 00:12:52.901 "abort": false, 00:12:52.901 "nvme_admin": false, 00:12:52.901 "nvme_io": false 00:12:52.901 }, 00:12:52.901 "memory_domains": [ 00:12:52.901 { 00:12:52.901 "dma_device_id": "system", 00:12:52.901 "dma_device_type": 1 00:12:52.901 }, 00:12:52.901 { 00:12:52.901 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:52.901 "dma_device_type": 2 00:12:52.901 }, 00:12:52.901 { 00:12:52.901 "dma_device_id": "system", 00:12:52.901 "dma_device_type": 1 00:12:52.901 }, 00:12:52.901 { 00:12:52.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.901 "dma_device_type": 2 00:12:52.901 }, 00:12:52.901 { 00:12:52.901 "dma_device_id": "system", 00:12:52.901 "dma_device_type": 1 00:12:52.901 }, 00:12:52.901 { 00:12:52.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.901 "dma_device_type": 2 00:12:52.901 } 00:12:52.901 ], 00:12:52.901 "driver_specific": { 00:12:52.901 "raid": { 00:12:52.901 "uuid": "7776f7aa-1a7d-4fda-8bd5-dbcd2ced82ba", 00:12:52.901 "strip_size_kb": 64, 00:12:52.901 "state": "online", 00:12:52.901 "raid_level": "concat", 00:12:52.901 "superblock": false, 00:12:52.901 "num_base_bdevs": 3, 00:12:52.901 "num_base_bdevs_discovered": 3, 00:12:52.901 "num_base_bdevs_operational": 3, 00:12:52.901 "base_bdevs_list": [ 00:12:52.901 { 00:12:52.901 "name": "BaseBdev1", 00:12:52.901 "uuid": "96299a04-cc23-44fb-9338-fe09ea2ae9df", 00:12:52.901 "is_configured": true, 00:12:52.901 "data_offset": 0, 00:12:52.901 "data_size": 65536 00:12:52.901 }, 00:12:52.901 { 00:12:52.901 "name": "BaseBdev2", 00:12:52.901 "uuid": "55534f8b-99cc-4afb-8dd5-0147f0fc5c1a", 00:12:52.901 "is_configured": true, 00:12:52.901 "data_offset": 0, 00:12:52.901 "data_size": 65536 00:12:52.901 }, 00:12:52.901 { 00:12:52.901 "name": "BaseBdev3", 00:12:52.901 "uuid": "e1fa2d0e-e219-46a5-ab51-1b44695dcc79", 00:12:52.901 "is_configured": true, 00:12:52.901 "data_offset": 0, 00:12:52.901 "data_size": 65536 00:12:52.901 } 00:12:52.901 ] 00:12:52.901 } 00:12:52.901 } 00:12:52.901 }' 00:12:52.901 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:52.901 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:12:52.901 BaseBdev2 
00:12:52.901 BaseBdev3' 00:12:52.901 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:52.901 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:52.901 11:49:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:53.159 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:53.159 "name": "BaseBdev1", 00:12:53.159 "aliases": [ 00:12:53.159 "96299a04-cc23-44fb-9338-fe09ea2ae9df" 00:12:53.159 ], 00:12:53.159 "product_name": "Malloc disk", 00:12:53.159 "block_size": 512, 00:12:53.159 "num_blocks": 65536, 00:12:53.159 "uuid": "96299a04-cc23-44fb-9338-fe09ea2ae9df", 00:12:53.159 "assigned_rate_limits": { 00:12:53.159 "rw_ios_per_sec": 0, 00:12:53.159 "rw_mbytes_per_sec": 0, 00:12:53.159 "r_mbytes_per_sec": 0, 00:12:53.159 "w_mbytes_per_sec": 0 00:12:53.159 }, 00:12:53.159 "claimed": true, 00:12:53.159 "claim_type": "exclusive_write", 00:12:53.159 "zoned": false, 00:12:53.159 "supported_io_types": { 00:12:53.159 "read": true, 00:12:53.159 "write": true, 00:12:53.159 "unmap": true, 00:12:53.159 "write_zeroes": true, 00:12:53.159 "flush": true, 00:12:53.159 "reset": true, 00:12:53.159 "compare": false, 00:12:53.159 "compare_and_write": false, 00:12:53.159 "abort": true, 00:12:53.159 "nvme_admin": false, 00:12:53.159 "nvme_io": false 00:12:53.159 }, 00:12:53.159 "memory_domains": [ 00:12:53.159 { 00:12:53.159 "dma_device_id": "system", 00:12:53.159 "dma_device_type": 1 00:12:53.159 }, 00:12:53.159 { 00:12:53.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.159 "dma_device_type": 2 00:12:53.159 } 00:12:53.159 ], 00:12:53.159 "driver_specific": {} 00:12:53.159 }' 00:12:53.159 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:53.159 11:49:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:53.159 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:53.159 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:53.159 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:53.159 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:53.159 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:53.417 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:53.417 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:53.417 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:53.417 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:53.417 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:53.417 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:53.417 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:53.417 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:53.676 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:53.676 "name": "BaseBdev2", 00:12:53.676 "aliases": [ 00:12:53.676 "55534f8b-99cc-4afb-8dd5-0147f0fc5c1a" 00:12:53.676 ], 00:12:53.676 "product_name": "Malloc disk", 00:12:53.676 "block_size": 512, 00:12:53.676 "num_blocks": 65536, 00:12:53.676 "uuid": "55534f8b-99cc-4afb-8dd5-0147f0fc5c1a", 00:12:53.676 "assigned_rate_limits": { 00:12:53.676 
"rw_ios_per_sec": 0, 00:12:53.676 "rw_mbytes_per_sec": 0, 00:12:53.676 "r_mbytes_per_sec": 0, 00:12:53.676 "w_mbytes_per_sec": 0 00:12:53.676 }, 00:12:53.676 "claimed": true, 00:12:53.676 "claim_type": "exclusive_write", 00:12:53.676 "zoned": false, 00:12:53.676 "supported_io_types": { 00:12:53.676 "read": true, 00:12:53.676 "write": true, 00:12:53.676 "unmap": true, 00:12:53.676 "write_zeroes": true, 00:12:53.676 "flush": true, 00:12:53.676 "reset": true, 00:12:53.676 "compare": false, 00:12:53.676 "compare_and_write": false, 00:12:53.676 "abort": true, 00:12:53.676 "nvme_admin": false, 00:12:53.676 "nvme_io": false 00:12:53.676 }, 00:12:53.676 "memory_domains": [ 00:12:53.676 { 00:12:53.676 "dma_device_id": "system", 00:12:53.676 "dma_device_type": 1 00:12:53.676 }, 00:12:53.676 { 00:12:53.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.676 "dma_device_type": 2 00:12:53.676 } 00:12:53.676 ], 00:12:53.676 "driver_specific": {} 00:12:53.676 }' 00:12:53.676 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:53.676 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:53.676 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:53.676 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:53.934 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:53.934 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:53.935 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:53.935 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:53.935 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:53.935 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 
00:12:53.935 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:53.935 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:53.935 11:49:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:53.935 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:53.935 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:54.193 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:54.193 "name": "BaseBdev3", 00:12:54.193 "aliases": [ 00:12:54.193 "e1fa2d0e-e219-46a5-ab51-1b44695dcc79" 00:12:54.193 ], 00:12:54.193 "product_name": "Malloc disk", 00:12:54.193 "block_size": 512, 00:12:54.193 "num_blocks": 65536, 00:12:54.193 "uuid": "e1fa2d0e-e219-46a5-ab51-1b44695dcc79", 00:12:54.193 "assigned_rate_limits": { 00:12:54.193 "rw_ios_per_sec": 0, 00:12:54.193 "rw_mbytes_per_sec": 0, 00:12:54.193 "r_mbytes_per_sec": 0, 00:12:54.193 "w_mbytes_per_sec": 0 00:12:54.193 }, 00:12:54.193 "claimed": true, 00:12:54.193 "claim_type": "exclusive_write", 00:12:54.193 "zoned": false, 00:12:54.193 "supported_io_types": { 00:12:54.193 "read": true, 00:12:54.193 "write": true, 00:12:54.193 "unmap": true, 00:12:54.193 "write_zeroes": true, 00:12:54.193 "flush": true, 00:12:54.193 "reset": true, 00:12:54.193 "compare": false, 00:12:54.193 "compare_and_write": false, 00:12:54.193 "abort": true, 00:12:54.193 "nvme_admin": false, 00:12:54.193 "nvme_io": false 00:12:54.193 }, 00:12:54.193 "memory_domains": [ 00:12:54.193 { 00:12:54.193 "dma_device_id": "system", 00:12:54.193 "dma_device_type": 1 00:12:54.193 }, 00:12:54.193 { 00:12:54.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.193 "dma_device_type": 2 00:12:54.193 } 00:12:54.193 ], 
00:12:54.193 "driver_specific": {} 00:12:54.193 }' 00:12:54.193 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:54.451 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:54.451 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:54.452 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:54.452 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:54.452 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:54.452 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:54.452 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:54.452 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:54.452 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:54.710 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:54.710 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:54.710 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:54.968 [2024-05-14 11:49:21.826391] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:54.968 [2024-05-14 11:49:21.826424] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:54.968 [2024-05-14 11:49:21.826466] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:12:54.968 11:49:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.968 11:49:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:55.227 11:49:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:55.227 
"name": "Existed_Raid", 00:12:55.227 "uuid": "7776f7aa-1a7d-4fda-8bd5-dbcd2ced82ba", 00:12:55.227 "strip_size_kb": 64, 00:12:55.227 "state": "offline", 00:12:55.227 "raid_level": "concat", 00:12:55.227 "superblock": false, 00:12:55.227 "num_base_bdevs": 3, 00:12:55.227 "num_base_bdevs_discovered": 2, 00:12:55.227 "num_base_bdevs_operational": 2, 00:12:55.227 "base_bdevs_list": [ 00:12:55.227 { 00:12:55.227 "name": null, 00:12:55.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:55.227 "is_configured": false, 00:12:55.227 "data_offset": 0, 00:12:55.227 "data_size": 65536 00:12:55.227 }, 00:12:55.227 { 00:12:55.227 "name": "BaseBdev2", 00:12:55.227 "uuid": "55534f8b-99cc-4afb-8dd5-0147f0fc5c1a", 00:12:55.227 "is_configured": true, 00:12:55.227 "data_offset": 0, 00:12:55.227 "data_size": 65536 00:12:55.227 }, 00:12:55.227 { 00:12:55.227 "name": "BaseBdev3", 00:12:55.227 "uuid": "e1fa2d0e-e219-46a5-ab51-1b44695dcc79", 00:12:55.227 "is_configured": true, 00:12:55.227 "data_offset": 0, 00:12:55.227 "data_size": 65536 00:12:55.227 } 00:12:55.227 ] 00:12:55.227 }' 00:12:55.227 11:49:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:55.227 11:49:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.793 11:49:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:12:55.793 11:49:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:55.793 11:49:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:55.793 11:49:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.065 11:49:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:56.065 11:49:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:12:56.065 11:49:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:56.324 [2024-05-14 11:49:23.150997] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:56.324 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:56.324 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:56.324 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.324 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:56.583 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:56.583 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:56.583 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:56.583 [2024-05-14 11:49:23.634808] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:56.583 [2024-05-14 11:49:23.634852] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19a1080 name Existed_Raid, state offline 00:12:56.583 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:56.583 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:56.841 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.841 
11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:12:56.841 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:12:56.841 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:12:56.841 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:12:56.841 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:12:56.841 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:56.842 11:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:57.099 BaseBdev2 00:12:57.099 11:49:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:12:57.099 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:57.099 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:57.099 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:57.099 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:57.100 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:57.100 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:57.358 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:57.358 [ 00:12:57.358 { 00:12:57.358 
"name": "BaseBdev2", 00:12:57.358 "aliases": [ 00:12:57.358 "8d123ae6-2da3-4c9d-97d3-69c8b726cc91" 00:12:57.358 ], 00:12:57.358 "product_name": "Malloc disk", 00:12:57.358 "block_size": 512, 00:12:57.358 "num_blocks": 65536, 00:12:57.358 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:12:57.358 "assigned_rate_limits": { 00:12:57.358 "rw_ios_per_sec": 0, 00:12:57.358 "rw_mbytes_per_sec": 0, 00:12:57.358 "r_mbytes_per_sec": 0, 00:12:57.358 "w_mbytes_per_sec": 0 00:12:57.358 }, 00:12:57.358 "claimed": false, 00:12:57.358 "zoned": false, 00:12:57.358 "supported_io_types": { 00:12:57.358 "read": true, 00:12:57.358 "write": true, 00:12:57.358 "unmap": true, 00:12:57.358 "write_zeroes": true, 00:12:57.358 "flush": true, 00:12:57.358 "reset": true, 00:12:57.358 "compare": false, 00:12:57.358 "compare_and_write": false, 00:12:57.358 "abort": true, 00:12:57.358 "nvme_admin": false, 00:12:57.358 "nvme_io": false 00:12:57.358 }, 00:12:57.358 "memory_domains": [ 00:12:57.358 { 00:12:57.358 "dma_device_id": "system", 00:12:57.358 "dma_device_type": 1 00:12:57.358 }, 00:12:57.358 { 00:12:57.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.358 "dma_device_type": 2 00:12:57.358 } 00:12:57.358 ], 00:12:57.358 "driver_specific": {} 00:12:57.358 } 00:12:57.358 ] 00:12:57.358 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:57.358 11:49:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:57.358 11:49:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:57.358 11:49:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:57.616 BaseBdev3 00:12:57.616 11:49:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:12:57.616 11:49:24 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:57.616 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:57.616 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:57.616 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:57.616 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:57.616 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:57.910 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:57.910 [ 00:12:57.910 { 00:12:57.910 "name": "BaseBdev3", 00:12:57.910 "aliases": [ 00:12:57.910 "c8de52e8-f297-4cc9-9eea-d81f31b27473" 00:12:57.910 ], 00:12:57.910 "product_name": "Malloc disk", 00:12:57.910 "block_size": 512, 00:12:57.910 "num_blocks": 65536, 00:12:57.910 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:12:57.910 "assigned_rate_limits": { 00:12:57.910 "rw_ios_per_sec": 0, 00:12:57.910 "rw_mbytes_per_sec": 0, 00:12:57.910 "r_mbytes_per_sec": 0, 00:12:57.910 "w_mbytes_per_sec": 0 00:12:57.910 }, 00:12:57.910 "claimed": false, 00:12:57.910 "zoned": false, 00:12:57.910 "supported_io_types": { 00:12:57.910 "read": true, 00:12:57.910 "write": true, 00:12:57.910 "unmap": true, 00:12:57.910 "write_zeroes": true, 00:12:57.910 "flush": true, 00:12:57.910 "reset": true, 00:12:57.910 "compare": false, 00:12:57.910 "compare_and_write": false, 00:12:57.910 "abort": true, 00:12:57.910 "nvme_admin": false, 00:12:57.910 "nvme_io": false 00:12:57.910 }, 00:12:57.910 "memory_domains": [ 00:12:57.910 { 00:12:57.910 "dma_device_id": "system", 
00:12:57.910 "dma_device_type": 1 00:12:57.910 }, 00:12:57.910 { 00:12:57.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.910 "dma_device_type": 2 00:12:57.910 } 00:12:57.910 ], 00:12:57.910 "driver_specific": {} 00:12:57.910 } 00:12:57.910 ] 00:12:57.910 11:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:57.910 11:49:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:57.910 11:49:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:57.910 11:49:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:58.186 [2024-05-14 11:49:25.075720] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:58.186 [2024-05-14 11:49:25.075760] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:58.186 [2024-05-14 11:49:25.075780] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:58.186 [2024-05-14 11:49:25.077163] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.186 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:58.445 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:58.445 "name": "Existed_Raid", 00:12:58.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.445 "strip_size_kb": 64, 00:12:58.445 "state": "configuring", 00:12:58.445 "raid_level": "concat", 00:12:58.445 "superblock": false, 00:12:58.445 "num_base_bdevs": 3, 00:12:58.445 "num_base_bdevs_discovered": 2, 00:12:58.445 "num_base_bdevs_operational": 3, 00:12:58.445 "base_bdevs_list": [ 00:12:58.445 { 00:12:58.445 "name": "BaseBdev1", 00:12:58.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.445 "is_configured": false, 00:12:58.445 "data_offset": 0, 00:12:58.445 "data_size": 0 00:12:58.445 }, 00:12:58.445 { 00:12:58.445 "name": "BaseBdev2", 00:12:58.445 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:12:58.445 "is_configured": true, 00:12:58.445 "data_offset": 0, 00:12:58.445 "data_size": 65536 00:12:58.445 }, 00:12:58.445 { 00:12:58.445 "name": "BaseBdev3", 00:12:58.445 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:12:58.445 "is_configured": true, 00:12:58.445 "data_offset": 0, 00:12:58.445 "data_size": 65536 
00:12:58.445 } 00:12:58.445 ] 00:12:58.445 }' 00:12:58.445 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:58.445 11:49:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.011 11:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:59.269 [2024-05-14 11:49:26.154552] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:59.269 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:59.269 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.270 11:49:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:59.528 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:59.528 "name": "Existed_Raid", 00:12:59.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.528 "strip_size_kb": 64, 00:12:59.528 "state": "configuring", 00:12:59.528 "raid_level": "concat", 00:12:59.528 "superblock": false, 00:12:59.529 "num_base_bdevs": 3, 00:12:59.529 "num_base_bdevs_discovered": 1, 00:12:59.529 "num_base_bdevs_operational": 3, 00:12:59.529 "base_bdevs_list": [ 00:12:59.529 { 00:12:59.529 "name": "BaseBdev1", 00:12:59.529 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.529 "is_configured": false, 00:12:59.529 "data_offset": 0, 00:12:59.529 "data_size": 0 00:12:59.529 }, 00:12:59.529 { 00:12:59.529 "name": null, 00:12:59.529 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:12:59.529 "is_configured": false, 00:12:59.529 "data_offset": 0, 00:12:59.529 "data_size": 65536 00:12:59.529 }, 00:12:59.529 { 00:12:59.529 "name": "BaseBdev3", 00:12:59.529 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:12:59.529 "is_configured": true, 00:12:59.529 "data_offset": 0, 00:12:59.529 "data_size": 65536 00:12:59.529 } 00:12:59.529 ] 00:12:59.529 }' 00:12:59.529 11:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:59.529 11:49:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.095 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.095 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:00.353 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:13:00.353 11:49:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:00.611 [2024-05-14 11:49:27.493593] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:00.611 BaseBdev1 00:13:00.611 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:13:00.611 11:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:00.611 11:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:00.611 11:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:00.611 11:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:00.611 11:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:00.612 11:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:00.869 11:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:01.128 [ 00:13:01.128 { 00:13:01.128 "name": "BaseBdev1", 00:13:01.128 "aliases": [ 00:13:01.128 "f9c718f8-dbe7-48ff-a254-d3322dc405db" 00:13:01.128 ], 00:13:01.128 "product_name": "Malloc disk", 00:13:01.128 "block_size": 512, 00:13:01.128 "num_blocks": 65536, 00:13:01.128 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:01.128 "assigned_rate_limits": { 00:13:01.128 "rw_ios_per_sec": 0, 00:13:01.128 "rw_mbytes_per_sec": 0, 00:13:01.128 "r_mbytes_per_sec": 0, 00:13:01.128 "w_mbytes_per_sec": 0 00:13:01.128 }, 00:13:01.128 "claimed": true, 00:13:01.128 "claim_type": 
"exclusive_write", 00:13:01.128 "zoned": false, 00:13:01.128 "supported_io_types": { 00:13:01.128 "read": true, 00:13:01.128 "write": true, 00:13:01.128 "unmap": true, 00:13:01.128 "write_zeroes": true, 00:13:01.128 "flush": true, 00:13:01.128 "reset": true, 00:13:01.128 "compare": false, 00:13:01.128 "compare_and_write": false, 00:13:01.128 "abort": true, 00:13:01.128 "nvme_admin": false, 00:13:01.128 "nvme_io": false 00:13:01.128 }, 00:13:01.128 "memory_domains": [ 00:13:01.128 { 00:13:01.128 "dma_device_id": "system", 00:13:01.128 "dma_device_type": 1 00:13:01.128 }, 00:13:01.128 { 00:13:01.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.128 "dma_device_type": 2 00:13:01.128 } 00:13:01.128 ], 00:13:01.128 "driver_specific": {} 00:13:01.128 } 00:13:01.128 ] 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:01.128 11:49:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.128 11:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.387 11:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:01.387 "name": "Existed_Raid", 00:13:01.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:01.387 "strip_size_kb": 64, 00:13:01.387 "state": "configuring", 00:13:01.387 "raid_level": "concat", 00:13:01.387 "superblock": false, 00:13:01.387 "num_base_bdevs": 3, 00:13:01.387 "num_base_bdevs_discovered": 2, 00:13:01.387 "num_base_bdevs_operational": 3, 00:13:01.387 "base_bdevs_list": [ 00:13:01.387 { 00:13:01.387 "name": "BaseBdev1", 00:13:01.387 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:01.387 "is_configured": true, 00:13:01.387 "data_offset": 0, 00:13:01.387 "data_size": 65536 00:13:01.387 }, 00:13:01.387 { 00:13:01.387 "name": null, 00:13:01.387 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:13:01.387 "is_configured": false, 00:13:01.387 "data_offset": 0, 00:13:01.387 "data_size": 65536 00:13:01.387 }, 00:13:01.387 { 00:13:01.387 "name": "BaseBdev3", 00:13:01.387 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:13:01.387 "is_configured": true, 00:13:01.387 "data_offset": 0, 00:13:01.387 "data_size": 65536 00:13:01.387 } 00:13:01.387 ] 00:13:01.387 }' 00:13:01.387 11:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:01.387 11:49:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.954 11:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:01.954 11:49:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.213 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:13:02.213 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:02.213 [2024-05-14 11:49:29.286382] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:02.472 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.472 11:49:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.730 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:02.730 "name": "Existed_Raid", 00:13:02.730 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:02.730 "strip_size_kb": 64, 00:13:02.730 "state": "configuring", 00:13:02.730 "raid_level": "concat", 00:13:02.730 "superblock": false, 00:13:02.730 "num_base_bdevs": 3, 00:13:02.730 "num_base_bdevs_discovered": 1, 00:13:02.730 "num_base_bdevs_operational": 3, 00:13:02.730 "base_bdevs_list": [ 00:13:02.730 { 00:13:02.730 "name": "BaseBdev1", 00:13:02.730 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:02.730 "is_configured": true, 00:13:02.730 "data_offset": 0, 00:13:02.730 "data_size": 65536 00:13:02.730 }, 00:13:02.730 { 00:13:02.730 "name": null, 00:13:02.730 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:13:02.730 "is_configured": false, 00:13:02.730 "data_offset": 0, 00:13:02.730 "data_size": 65536 00:13:02.730 }, 00:13:02.730 { 00:13:02.730 "name": null, 00:13:02.730 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:13:02.730 "is_configured": false, 00:13:02.730 "data_offset": 0, 00:13:02.730 "data_size": 65536 00:13:02.730 } 00:13:02.730 ] 00:13:02.730 }' 00:13:02.730 11:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:02.730 11:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:03.296 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.296 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:03.554 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:13:03.554 11:49:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:03.554 [2024-05-14 11:49:30.625943] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:03.813 "name": "Existed_Raid", 00:13:03.813 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:03.813 "strip_size_kb": 64, 00:13:03.813 "state": "configuring", 00:13:03.813 "raid_level": "concat", 00:13:03.813 "superblock": false, 00:13:03.813 "num_base_bdevs": 3, 00:13:03.813 "num_base_bdevs_discovered": 2, 00:13:03.813 "num_base_bdevs_operational": 3, 00:13:03.813 "base_bdevs_list": [ 00:13:03.813 { 00:13:03.813 "name": "BaseBdev1", 00:13:03.813 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:03.813 "is_configured": true, 00:13:03.813 "data_offset": 0, 00:13:03.813 "data_size": 65536 00:13:03.813 }, 00:13:03.813 { 00:13:03.813 "name": null, 00:13:03.813 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:13:03.813 "is_configured": false, 00:13:03.813 "data_offset": 0, 00:13:03.813 "data_size": 65536 00:13:03.813 }, 00:13:03.813 { 00:13:03.813 "name": "BaseBdev3", 00:13:03.813 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:13:03.813 "is_configured": true, 00:13:03.813 "data_offset": 0, 00:13:03.813 "data_size": 65536 00:13:03.813 } 00:13:03.813 ] 00:13:03.813 }' 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:03.813 11:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:04.749 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.749 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:04.749 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:13:04.749 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:05.008 [2024-05-14 11:49:31.937455] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 
00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.008 11:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.267 11:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:05.267 "name": "Existed_Raid", 00:13:05.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:05.267 "strip_size_kb": 64, 00:13:05.267 "state": "configuring", 00:13:05.267 "raid_level": "concat", 00:13:05.267 "superblock": false, 00:13:05.267 "num_base_bdevs": 3, 00:13:05.267 "num_base_bdevs_discovered": 1, 00:13:05.267 "num_base_bdevs_operational": 3, 00:13:05.267 "base_bdevs_list": [ 
00:13:05.267 { 00:13:05.267 "name": null, 00:13:05.267 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:05.267 "is_configured": false, 00:13:05.267 "data_offset": 0, 00:13:05.267 "data_size": 65536 00:13:05.267 }, 00:13:05.267 { 00:13:05.267 "name": null, 00:13:05.267 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:13:05.267 "is_configured": false, 00:13:05.267 "data_offset": 0, 00:13:05.267 "data_size": 65536 00:13:05.267 }, 00:13:05.267 { 00:13:05.267 "name": "BaseBdev3", 00:13:05.267 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:13:05.267 "is_configured": true, 00:13:05.267 "data_offset": 0, 00:13:05.267 "data_size": 65536 00:13:05.267 } 00:13:05.267 ] 00:13:05.267 }' 00:13:05.267 11:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:05.267 11:49:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:05.833 11:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.833 11:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:06.091 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:13:06.091 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:06.349 [2024-05-14 11:49:33.267580] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:06.350 11:49:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:06.350 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.608 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:06.609 "name": "Existed_Raid", 00:13:06.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.609 "strip_size_kb": 64, 00:13:06.609 "state": "configuring", 00:13:06.609 "raid_level": "concat", 00:13:06.609 "superblock": false, 00:13:06.609 "num_base_bdevs": 3, 00:13:06.609 "num_base_bdevs_discovered": 2, 00:13:06.609 "num_base_bdevs_operational": 3, 00:13:06.609 "base_bdevs_list": [ 00:13:06.609 { 00:13:06.609 "name": null, 00:13:06.609 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:06.609 "is_configured": false, 00:13:06.609 "data_offset": 0, 00:13:06.609 "data_size": 65536 00:13:06.609 }, 00:13:06.609 { 00:13:06.609 "name": "BaseBdev2", 00:13:06.609 "uuid": 
"8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:13:06.609 "is_configured": true, 00:13:06.609 "data_offset": 0, 00:13:06.609 "data_size": 65536 00:13:06.609 }, 00:13:06.609 { 00:13:06.609 "name": "BaseBdev3", 00:13:06.609 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:13:06.609 "is_configured": true, 00:13:06.609 "data_offset": 0, 00:13:06.609 "data_size": 65536 00:13:06.609 } 00:13:06.609 ] 00:13:06.609 }' 00:13:06.609 11:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:06.609 11:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.174 11:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:07.175 11:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.433 11:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:13:07.433 11:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.433 11:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:07.692 11:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f9c718f8-dbe7-48ff-a254-d3322dc405db 00:13:07.952 [2024-05-14 11:49:34.816254] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:07.952 [2024-05-14 11:49:34.816293] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b44ec0 00:13:07.952 [2024-05-14 11:49:34.816302] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, 
blocklen 512 00:13:07.952 [2024-05-14 11:49:34.816506] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b49f00 00:13:07.952 [2024-05-14 11:49:34.816633] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b44ec0 00:13:07.952 [2024-05-14 11:49:34.816643] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b44ec0 00:13:07.952 [2024-05-14 11:49:34.816807] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:07.952 NewBaseBdev 00:13:07.952 11:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:13:07.952 11:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:13:07.952 11:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:07.952 11:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:07.952 11:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:07.952 11:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:07.952 11:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:08.211 11:49:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:08.469 [ 00:13:08.469 { 00:13:08.469 "name": "NewBaseBdev", 00:13:08.469 "aliases": [ 00:13:08.469 "f9c718f8-dbe7-48ff-a254-d3322dc405db" 00:13:08.469 ], 00:13:08.469 "product_name": "Malloc disk", 00:13:08.469 "block_size": 512, 00:13:08.469 "num_blocks": 65536, 00:13:08.469 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:08.469 
"assigned_rate_limits": { 00:13:08.469 "rw_ios_per_sec": 0, 00:13:08.469 "rw_mbytes_per_sec": 0, 00:13:08.469 "r_mbytes_per_sec": 0, 00:13:08.469 "w_mbytes_per_sec": 0 00:13:08.469 }, 00:13:08.469 "claimed": true, 00:13:08.469 "claim_type": "exclusive_write", 00:13:08.469 "zoned": false, 00:13:08.469 "supported_io_types": { 00:13:08.469 "read": true, 00:13:08.469 "write": true, 00:13:08.469 "unmap": true, 00:13:08.469 "write_zeroes": true, 00:13:08.469 "flush": true, 00:13:08.469 "reset": true, 00:13:08.469 "compare": false, 00:13:08.469 "compare_and_write": false, 00:13:08.469 "abort": true, 00:13:08.469 "nvme_admin": false, 00:13:08.469 "nvme_io": false 00:13:08.469 }, 00:13:08.469 "memory_domains": [ 00:13:08.469 { 00:13:08.469 "dma_device_id": "system", 00:13:08.469 "dma_device_type": 1 00:13:08.469 }, 00:13:08.469 { 00:13:08.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:08.469 "dma_device_type": 2 00:13:08.469 } 00:13:08.469 ], 00:13:08.469 "driver_specific": {} 00:13:08.469 } 00:13:08.469 ] 00:13:08.469 11:49:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:08.469 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:08.469 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:08.469 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:08.470 11:49:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:08.470 "name": "Existed_Raid", 00:13:08.470 "uuid": "c661f3dd-1ec5-4ac5-bc3f-3dab4427d7dd", 00:13:08.470 "strip_size_kb": 64, 00:13:08.470 "state": "online", 00:13:08.470 "raid_level": "concat", 00:13:08.470 "superblock": false, 00:13:08.470 "num_base_bdevs": 3, 00:13:08.470 "num_base_bdevs_discovered": 3, 00:13:08.470 "num_base_bdevs_operational": 3, 00:13:08.470 "base_bdevs_list": [ 00:13:08.470 { 00:13:08.470 "name": "NewBaseBdev", 00:13:08.470 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:08.470 "is_configured": true, 00:13:08.470 "data_offset": 0, 00:13:08.470 "data_size": 65536 00:13:08.470 }, 00:13:08.470 { 00:13:08.470 "name": "BaseBdev2", 00:13:08.470 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:13:08.470 "is_configured": true, 00:13:08.470 "data_offset": 0, 00:13:08.470 "data_size": 65536 00:13:08.470 }, 00:13:08.470 { 00:13:08.470 "name": "BaseBdev3", 00:13:08.470 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:13:08.470 "is_configured": true, 00:13:08.470 "data_offset": 0, 00:13:08.470 "data_size": 65536 00:13:08.470 } 00:13:08.470 ] 00:13:08.470 }' 00:13:08.470 11:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:08.470 11:49:35 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:09.407 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:13:09.407 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:13:09.407 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:09.407 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:09.407 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:09.407 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:09.407 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:09.407 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:09.407 [2024-05-14 11:49:36.304501] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:09.407 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:09.407 "name": "Existed_Raid", 00:13:09.407 "aliases": [ 00:13:09.407 "c661f3dd-1ec5-4ac5-bc3f-3dab4427d7dd" 00:13:09.407 ], 00:13:09.407 "product_name": "Raid Volume", 00:13:09.407 "block_size": 512, 00:13:09.407 "num_blocks": 196608, 00:13:09.407 "uuid": "c661f3dd-1ec5-4ac5-bc3f-3dab4427d7dd", 00:13:09.407 "assigned_rate_limits": { 00:13:09.407 "rw_ios_per_sec": 0, 00:13:09.407 "rw_mbytes_per_sec": 0, 00:13:09.407 "r_mbytes_per_sec": 0, 00:13:09.407 "w_mbytes_per_sec": 0 00:13:09.407 }, 00:13:09.407 "claimed": false, 00:13:09.408 "zoned": false, 00:13:09.408 "supported_io_types": { 00:13:09.408 "read": true, 00:13:09.408 "write": true, 00:13:09.408 "unmap": true, 00:13:09.408 "write_zeroes": true, 
00:13:09.408 "flush": true, 00:13:09.408 "reset": true, 00:13:09.408 "compare": false, 00:13:09.408 "compare_and_write": false, 00:13:09.408 "abort": false, 00:13:09.408 "nvme_admin": false, 00:13:09.408 "nvme_io": false 00:13:09.408 }, 00:13:09.408 "memory_domains": [ 00:13:09.408 { 00:13:09.408 "dma_device_id": "system", 00:13:09.408 "dma_device_type": 1 00:13:09.408 }, 00:13:09.408 { 00:13:09.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.408 "dma_device_type": 2 00:13:09.408 }, 00:13:09.408 { 00:13:09.408 "dma_device_id": "system", 00:13:09.408 "dma_device_type": 1 00:13:09.408 }, 00:13:09.408 { 00:13:09.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.408 "dma_device_type": 2 00:13:09.408 }, 00:13:09.408 { 00:13:09.408 "dma_device_id": "system", 00:13:09.408 "dma_device_type": 1 00:13:09.408 }, 00:13:09.408 { 00:13:09.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.408 "dma_device_type": 2 00:13:09.408 } 00:13:09.408 ], 00:13:09.408 "driver_specific": { 00:13:09.408 "raid": { 00:13:09.408 "uuid": "c661f3dd-1ec5-4ac5-bc3f-3dab4427d7dd", 00:13:09.408 "strip_size_kb": 64, 00:13:09.408 "state": "online", 00:13:09.408 "raid_level": "concat", 00:13:09.408 "superblock": false, 00:13:09.408 "num_base_bdevs": 3, 00:13:09.408 "num_base_bdevs_discovered": 3, 00:13:09.408 "num_base_bdevs_operational": 3, 00:13:09.408 "base_bdevs_list": [ 00:13:09.408 { 00:13:09.408 "name": "NewBaseBdev", 00:13:09.408 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:09.408 "is_configured": true, 00:13:09.408 "data_offset": 0, 00:13:09.408 "data_size": 65536 00:13:09.408 }, 00:13:09.408 { 00:13:09.408 "name": "BaseBdev2", 00:13:09.408 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:13:09.408 "is_configured": true, 00:13:09.408 "data_offset": 0, 00:13:09.408 "data_size": 65536 00:13:09.408 }, 00:13:09.408 { 00:13:09.408 "name": "BaseBdev3", 00:13:09.408 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:13:09.408 "is_configured": true, 00:13:09.408 "data_offset": 
0, 00:13:09.408 "data_size": 65536 00:13:09.408 } 00:13:09.408 ] 00:13:09.408 } 00:13:09.408 } 00:13:09.408 }' 00:13:09.408 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:09.408 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:13:09.408 BaseBdev2 00:13:09.408 BaseBdev3' 00:13:09.408 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:09.408 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:09.408 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:09.666 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:09.666 "name": "NewBaseBdev", 00:13:09.666 "aliases": [ 00:13:09.666 "f9c718f8-dbe7-48ff-a254-d3322dc405db" 00:13:09.666 ], 00:13:09.666 "product_name": "Malloc disk", 00:13:09.666 "block_size": 512, 00:13:09.666 "num_blocks": 65536, 00:13:09.666 "uuid": "f9c718f8-dbe7-48ff-a254-d3322dc405db", 00:13:09.666 "assigned_rate_limits": { 00:13:09.666 "rw_ios_per_sec": 0, 00:13:09.666 "rw_mbytes_per_sec": 0, 00:13:09.666 "r_mbytes_per_sec": 0, 00:13:09.666 "w_mbytes_per_sec": 0 00:13:09.666 }, 00:13:09.666 "claimed": true, 00:13:09.666 "claim_type": "exclusive_write", 00:13:09.666 "zoned": false, 00:13:09.666 "supported_io_types": { 00:13:09.666 "read": true, 00:13:09.666 "write": true, 00:13:09.666 "unmap": true, 00:13:09.666 "write_zeroes": true, 00:13:09.666 "flush": true, 00:13:09.666 "reset": true, 00:13:09.666 "compare": false, 00:13:09.666 "compare_and_write": false, 00:13:09.666 "abort": true, 00:13:09.666 "nvme_admin": false, 00:13:09.666 "nvme_io": false 00:13:09.666 }, 00:13:09.666 "memory_domains": [ 
00:13:09.666 { 00:13:09.666 "dma_device_id": "system", 00:13:09.666 "dma_device_type": 1 00:13:09.666 }, 00:13:09.666 { 00:13:09.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.666 "dma_device_type": 2 00:13:09.666 } 00:13:09.666 ], 00:13:09.666 "driver_specific": {} 00:13:09.666 }' 00:13:09.666 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:09.666 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:09.666 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:09.666 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:09.666 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:09.666 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:09.924 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:09.924 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:09.924 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:09.924 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:09.924 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:09.924 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:09.924 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:09.925 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:09.925 11:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:10.184 11:49:37 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:10.184 "name": "BaseBdev2", 00:13:10.184 "aliases": [ 00:13:10.184 "8d123ae6-2da3-4c9d-97d3-69c8b726cc91" 00:13:10.184 ], 00:13:10.184 "product_name": "Malloc disk", 00:13:10.184 "block_size": 512, 00:13:10.184 "num_blocks": 65536, 00:13:10.184 "uuid": "8d123ae6-2da3-4c9d-97d3-69c8b726cc91", 00:13:10.184 "assigned_rate_limits": { 00:13:10.184 "rw_ios_per_sec": 0, 00:13:10.184 "rw_mbytes_per_sec": 0, 00:13:10.184 "r_mbytes_per_sec": 0, 00:13:10.184 "w_mbytes_per_sec": 0 00:13:10.184 }, 00:13:10.184 "claimed": true, 00:13:10.184 "claim_type": "exclusive_write", 00:13:10.184 "zoned": false, 00:13:10.184 "supported_io_types": { 00:13:10.184 "read": true, 00:13:10.184 "write": true, 00:13:10.184 "unmap": true, 00:13:10.184 "write_zeroes": true, 00:13:10.184 "flush": true, 00:13:10.184 "reset": true, 00:13:10.184 "compare": false, 00:13:10.184 "compare_and_write": false, 00:13:10.184 "abort": true, 00:13:10.184 "nvme_admin": false, 00:13:10.184 "nvme_io": false 00:13:10.184 }, 00:13:10.184 "memory_domains": [ 00:13:10.184 { 00:13:10.184 "dma_device_id": "system", 00:13:10.184 "dma_device_type": 1 00:13:10.184 }, 00:13:10.184 { 00:13:10.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.184 "dma_device_type": 2 00:13:10.184 } 00:13:10.184 ], 00:13:10.184 "driver_specific": {} 00:13:10.184 }' 00:13:10.184 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:10.184 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:10.184 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:10.184 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:10.184 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:10.184 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:10.184 11:49:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:10.443 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:10.443 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:10.443 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:10.443 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:10.443 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:10.443 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:10.443 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:10.443 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:10.702 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:10.702 "name": "BaseBdev3", 00:13:10.702 "aliases": [ 00:13:10.702 "c8de52e8-f297-4cc9-9eea-d81f31b27473" 00:13:10.702 ], 00:13:10.702 "product_name": "Malloc disk", 00:13:10.702 "block_size": 512, 00:13:10.702 "num_blocks": 65536, 00:13:10.702 "uuid": "c8de52e8-f297-4cc9-9eea-d81f31b27473", 00:13:10.702 "assigned_rate_limits": { 00:13:10.702 "rw_ios_per_sec": 0, 00:13:10.702 "rw_mbytes_per_sec": 0, 00:13:10.702 "r_mbytes_per_sec": 0, 00:13:10.702 "w_mbytes_per_sec": 0 00:13:10.702 }, 00:13:10.702 "claimed": true, 00:13:10.702 "claim_type": "exclusive_write", 00:13:10.702 "zoned": false, 00:13:10.702 "supported_io_types": { 00:13:10.702 "read": true, 00:13:10.702 "write": true, 00:13:10.702 "unmap": true, 00:13:10.702 "write_zeroes": true, 00:13:10.702 "flush": true, 00:13:10.702 "reset": true, 00:13:10.702 "compare": false, 00:13:10.702 "compare_and_write": 
false, 00:13:10.702 "abort": true, 00:13:10.702 "nvme_admin": false, 00:13:10.702 "nvme_io": false 00:13:10.702 }, 00:13:10.702 "memory_domains": [ 00:13:10.702 { 00:13:10.702 "dma_device_id": "system", 00:13:10.702 "dma_device_type": 1 00:13:10.702 }, 00:13:10.702 { 00:13:10.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.702 "dma_device_type": 2 00:13:10.702 } 00:13:10.702 ], 00:13:10.702 "driver_specific": {} 00:13:10.702 }' 00:13:10.702 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:10.702 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:10.702 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:10.702 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:10.702 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:10.702 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:10.702 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:10.702 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:10.961 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:10.961 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:10.961 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:10.961 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:10.961 11:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:11.220 [2024-05-14 11:49:38.100995] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 
00:13:11.220 [2024-05-14 11:49:38.101022] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:11.220 [2024-05-14 11:49:38.101074] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:11.220 [2024-05-14 11:49:38.101125] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:11.220 [2024-05-14 11:49:38.101137] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b44ec0 name Existed_Raid, state offline 00:13:11.220 11:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 1689607 00:13:11.220 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 1689607 ']' 00:13:11.220 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 1689607 00:13:11.220 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:13:11.220 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:13:11.220 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1689607 00:13:11.220 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:13:11.220 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:13:11.220 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1689607' 00:13:11.220 killing process with pid 1689607 00:13:11.221 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 1689607 00:13:11.221 [2024-05-14 11:49:38.171091] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:11.221 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 1689607 00:13:11.221 
[2024-05-14 11:49:38.200806] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:13:11.480 00:13:11.480 real 0m27.271s 00:13:11.480 user 0m50.013s 00:13:11.480 sys 0m4.866s 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.480 ************************************ 00:13:11.480 END TEST raid_state_function_test 00:13:11.480 ************************************ 00:13:11.480 11:49:38 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:13:11.480 11:49:38 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:13:11.480 11:49:38 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:13:11.480 11:49:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:11.480 ************************************ 00:13:11.480 START TEST raid_state_function_test_sb 00:13:11.480 ************************************ 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 3 true 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:11.480 11:49:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 
00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=1693736 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1693736' 00:13:11.480 Process raid pid: 1693736 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 1693736 /var/tmp/spdk-raid.sock 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1693736 ']' 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:11.480 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:13:11.480 11:49:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:11.739 [2024-05-14 11:49:38.615249] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:13:11.739 [2024-05-14 11:49:38.615319] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:11.739 [2024-05-14 11:49:38.748405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:11.998 [2024-05-14 11:49:38.856539] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:11.998 [2024-05-14 11:49:38.920949] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:11.998 [2024-05-14 11:49:38.920990] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:12.565 11:49:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:13:12.565 11:49:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:13:12.565 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:12.825 [2024-05-14 11:49:39.708030] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:12.825 [2024-05-14 11:49:39.708071] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:12.825 [2024-05-14 11:49:39.708083] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:12.825 [2024-05-14 11:49:39.708094] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:12.825 [2024-05-14 11:49:39.708103] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:12.825 [2024-05-14 11:49:39.708114] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:12.825 11:49:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.825 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.119 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:13.119 "name": "Existed_Raid", 00:13:13.119 "uuid": "51261177-3d4d-4f92-b266-7a165fda8415", 00:13:13.119 "strip_size_kb": 64, 00:13:13.119 "state": "configuring", 00:13:13.119 "raid_level": "concat", 00:13:13.119 "superblock": true, 00:13:13.119 "num_base_bdevs": 3, 00:13:13.119 "num_base_bdevs_discovered": 0, 00:13:13.119 "num_base_bdevs_operational": 3, 00:13:13.119 
"base_bdevs_list": [ 00:13:13.119 { 00:13:13.119 "name": "BaseBdev1", 00:13:13.119 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.119 "is_configured": false, 00:13:13.119 "data_offset": 0, 00:13:13.119 "data_size": 0 00:13:13.119 }, 00:13:13.119 { 00:13:13.119 "name": "BaseBdev2", 00:13:13.119 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.119 "is_configured": false, 00:13:13.120 "data_offset": 0, 00:13:13.120 "data_size": 0 00:13:13.120 }, 00:13:13.120 { 00:13:13.120 "name": "BaseBdev3", 00:13:13.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.120 "is_configured": false, 00:13:13.120 "data_offset": 0, 00:13:13.120 "data_size": 0 00:13:13.120 } 00:13:13.120 ] 00:13:13.120 }' 00:13:13.120 11:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:13.120 11:49:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:13.378 11:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:13.638 [2024-05-14 11:49:40.646359] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:13.638 [2024-05-14 11:49:40.646393] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2581700 name Existed_Raid, state configuring 00:13:13.638 11:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:13.897 [2024-05-14 11:49:40.891041] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:13.897 [2024-05-14 11:49:40.891071] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:13.897 [2024-05-14 11:49:40.891081] 
bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:13.897 [2024-05-14 11:49:40.891093] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:13.897 [2024-05-14 11:49:40.891101] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:13.897 [2024-05-14 11:49:40.891112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:13.897 11:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:14.157 [2024-05-14 11:49:41.149529] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:14.157 BaseBdev1 00:13:14.157 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:13:14.157 11:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:14.157 11:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:14.157 11:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:14.157 11:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:14.157 11:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:14.157 11:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:14.416 11:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:14.689 [ 00:13:14.689 { 
00:13:14.689 "name": "BaseBdev1", 00:13:14.689 "aliases": [ 00:13:14.689 "7c31222b-a13e-4b10-b843-cf06a6144216" 00:13:14.689 ], 00:13:14.689 "product_name": "Malloc disk", 00:13:14.689 "block_size": 512, 00:13:14.689 "num_blocks": 65536, 00:13:14.689 "uuid": "7c31222b-a13e-4b10-b843-cf06a6144216", 00:13:14.689 "assigned_rate_limits": { 00:13:14.689 "rw_ios_per_sec": 0, 00:13:14.689 "rw_mbytes_per_sec": 0, 00:13:14.689 "r_mbytes_per_sec": 0, 00:13:14.689 "w_mbytes_per_sec": 0 00:13:14.689 }, 00:13:14.690 "claimed": true, 00:13:14.690 "claim_type": "exclusive_write", 00:13:14.690 "zoned": false, 00:13:14.690 "supported_io_types": { 00:13:14.690 "read": true, 00:13:14.690 "write": true, 00:13:14.690 "unmap": true, 00:13:14.690 "write_zeroes": true, 00:13:14.690 "flush": true, 00:13:14.690 "reset": true, 00:13:14.690 "compare": false, 00:13:14.690 "compare_and_write": false, 00:13:14.690 "abort": true, 00:13:14.690 "nvme_admin": false, 00:13:14.690 "nvme_io": false 00:13:14.690 }, 00:13:14.690 "memory_domains": [ 00:13:14.690 { 00:13:14.690 "dma_device_id": "system", 00:13:14.690 "dma_device_type": 1 00:13:14.690 }, 00:13:14.690 { 00:13:14.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.690 "dma_device_type": 2 00:13:14.690 } 00:13:14.690 ], 00:13:14.690 "driver_specific": {} 00:13:14.690 } 00:13:14.690 ] 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:14.690 11:49:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.690 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:14.976 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:14.976 "name": "Existed_Raid", 00:13:14.976 "uuid": "2f5746f3-c4c1-4e4b-a7b9-e5d9ff5412f9", 00:13:14.976 "strip_size_kb": 64, 00:13:14.976 "state": "configuring", 00:13:14.976 "raid_level": "concat", 00:13:14.976 "superblock": true, 00:13:14.976 "num_base_bdevs": 3, 00:13:14.976 "num_base_bdevs_discovered": 1, 00:13:14.976 "num_base_bdevs_operational": 3, 00:13:14.976 "base_bdevs_list": [ 00:13:14.976 { 00:13:14.976 "name": "BaseBdev1", 00:13:14.976 "uuid": "7c31222b-a13e-4b10-b843-cf06a6144216", 00:13:14.976 "is_configured": true, 00:13:14.976 "data_offset": 2048, 00:13:14.976 "data_size": 63488 00:13:14.976 }, 00:13:14.976 { 00:13:14.976 "name": "BaseBdev2", 00:13:14.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:14.976 "is_configured": false, 00:13:14.976 "data_offset": 0, 00:13:14.976 "data_size": 0 00:13:14.976 }, 00:13:14.976 { 00:13:14.976 "name": 
"BaseBdev3", 00:13:14.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:14.976 "is_configured": false, 00:13:14.976 "data_offset": 0, 00:13:14.976 "data_size": 0 00:13:14.976 } 00:13:14.976 ] 00:13:14.976 }' 00:13:14.976 11:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:14.976 11:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:15.545 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:15.545 [2024-05-14 11:49:42.513289] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:15.545 [2024-05-14 11:49:42.513332] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2580ff0 name Existed_Raid, state configuring 00:13:15.545 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:15.804 [2024-05-14 11:49:42.762156] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:15.804 [2024-05-14 11:49:42.763652] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:15.804 [2024-05-14 11:49:42.763685] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:15.804 [2024-05-14 11:49:42.763696] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:15.804 [2024-05-14 11:49:42.763708] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.804 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.064 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:16.064 "name": "Existed_Raid", 00:13:16.064 "uuid": "a8b018d5-82e7-4c5a-8701-16bbb00f50e7", 00:13:16.064 "strip_size_kb": 64, 00:13:16.064 "state": "configuring", 00:13:16.064 "raid_level": "concat", 00:13:16.064 "superblock": true, 00:13:16.064 "num_base_bdevs": 3, 00:13:16.064 
"num_base_bdevs_discovered": 1, 00:13:16.064 "num_base_bdevs_operational": 3, 00:13:16.064 "base_bdevs_list": [ 00:13:16.064 { 00:13:16.064 "name": "BaseBdev1", 00:13:16.064 "uuid": "7c31222b-a13e-4b10-b843-cf06a6144216", 00:13:16.064 "is_configured": true, 00:13:16.064 "data_offset": 2048, 00:13:16.064 "data_size": 63488 00:13:16.064 }, 00:13:16.064 { 00:13:16.064 "name": "BaseBdev2", 00:13:16.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.064 "is_configured": false, 00:13:16.064 "data_offset": 0, 00:13:16.064 "data_size": 0 00:13:16.064 }, 00:13:16.064 { 00:13:16.064 "name": "BaseBdev3", 00:13:16.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.064 "is_configured": false, 00:13:16.064 "data_offset": 0, 00:13:16.064 "data_size": 0 00:13:16.064 } 00:13:16.064 ] 00:13:16.064 }' 00:13:16.064 11:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:16.064 11:49:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:16.631 11:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:16.631 [2024-05-14 11:49:43.599739] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:16.631 BaseBdev2 00:13:16.631 11:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:13:16.631 11:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:16.631 11:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:16.631 11:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:16.631 11:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:16.631 11:49:43 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:16.631 11:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:16.891 11:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:17.150 [ 00:13:17.150 { 00:13:17.150 "name": "BaseBdev2", 00:13:17.150 "aliases": [ 00:13:17.150 "46b32701-ce84-4950-b1aa-2fbd1ed9d539" 00:13:17.150 ], 00:13:17.150 "product_name": "Malloc disk", 00:13:17.150 "block_size": 512, 00:13:17.150 "num_blocks": 65536, 00:13:17.150 "uuid": "46b32701-ce84-4950-b1aa-2fbd1ed9d539", 00:13:17.150 "assigned_rate_limits": { 00:13:17.150 "rw_ios_per_sec": 0, 00:13:17.150 "rw_mbytes_per_sec": 0, 00:13:17.150 "r_mbytes_per_sec": 0, 00:13:17.150 "w_mbytes_per_sec": 0 00:13:17.150 }, 00:13:17.150 "claimed": true, 00:13:17.150 "claim_type": "exclusive_write", 00:13:17.150 "zoned": false, 00:13:17.150 "supported_io_types": { 00:13:17.150 "read": true, 00:13:17.150 "write": true, 00:13:17.150 "unmap": true, 00:13:17.150 "write_zeroes": true, 00:13:17.150 "flush": true, 00:13:17.150 "reset": true, 00:13:17.150 "compare": false, 00:13:17.150 "compare_and_write": false, 00:13:17.150 "abort": true, 00:13:17.150 "nvme_admin": false, 00:13:17.150 "nvme_io": false 00:13:17.150 }, 00:13:17.150 "memory_domains": [ 00:13:17.150 { 00:13:17.150 "dma_device_id": "system", 00:13:17.150 "dma_device_type": 1 00:13:17.150 }, 00:13:17.150 { 00:13:17.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.150 "dma_device_type": 2 00:13:17.150 } 00:13:17.150 ], 00:13:17.150 "driver_specific": {} 00:13:17.150 } 00:13:17.150 ] 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 
00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:17.151 "name": "Existed_Raid", 00:13:17.151 "uuid": "a8b018d5-82e7-4c5a-8701-16bbb00f50e7", 00:13:17.151 "strip_size_kb": 64, 00:13:17.151 
"state": "configuring", 00:13:17.151 "raid_level": "concat", 00:13:17.151 "superblock": true, 00:13:17.151 "num_base_bdevs": 3, 00:13:17.151 "num_base_bdevs_discovered": 2, 00:13:17.151 "num_base_bdevs_operational": 3, 00:13:17.151 "base_bdevs_list": [ 00:13:17.151 { 00:13:17.151 "name": "BaseBdev1", 00:13:17.151 "uuid": "7c31222b-a13e-4b10-b843-cf06a6144216", 00:13:17.151 "is_configured": true, 00:13:17.151 "data_offset": 2048, 00:13:17.151 "data_size": 63488 00:13:17.151 }, 00:13:17.151 { 00:13:17.151 "name": "BaseBdev2", 00:13:17.151 "uuid": "46b32701-ce84-4950-b1aa-2fbd1ed9d539", 00:13:17.151 "is_configured": true, 00:13:17.151 "data_offset": 2048, 00:13:17.151 "data_size": 63488 00:13:17.151 }, 00:13:17.151 { 00:13:17.151 "name": "BaseBdev3", 00:13:17.151 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.151 "is_configured": false, 00:13:17.151 "data_offset": 0, 00:13:17.151 "data_size": 0 00:13:17.151 } 00:13:17.151 ] 00:13:17.151 }' 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:17.151 11:49:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:18.090 11:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:18.090 [2024-05-14 11:49:45.063086] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:18.090 [2024-05-14 11:49:45.063253] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x2582080 00:13:18.090 [2024-05-14 11:49:45.063268] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:18.090 [2024-05-14 11:49:45.063453] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2581d50 00:13:18.090 [2024-05-14 11:49:45.063581] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x2582080 00:13:18.090 [2024-05-14 11:49:45.063592] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2582080 00:13:18.090 [2024-05-14 11:49:45.063685] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:18.090 BaseBdev3 00:13:18.090 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:13:18.090 11:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:18.090 11:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:18.090 11:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:18.090 11:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:18.090 11:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:18.090 11:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:18.349 11:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:18.609 [ 00:13:18.609 { 00:13:18.609 "name": "BaseBdev3", 00:13:18.609 "aliases": [ 00:13:18.609 "3d64ea84-e50d-4add-b396-15a8e44a9bd7" 00:13:18.609 ], 00:13:18.609 "product_name": "Malloc disk", 00:13:18.609 "block_size": 512, 00:13:18.609 "num_blocks": 65536, 00:13:18.609 "uuid": "3d64ea84-e50d-4add-b396-15a8e44a9bd7", 00:13:18.609 "assigned_rate_limits": { 00:13:18.609 "rw_ios_per_sec": 0, 00:13:18.609 "rw_mbytes_per_sec": 0, 00:13:18.609 "r_mbytes_per_sec": 0, 00:13:18.609 "w_mbytes_per_sec": 0 00:13:18.609 }, 00:13:18.609 "claimed": true, 00:13:18.609 
"claim_type": "exclusive_write", 00:13:18.609 "zoned": false, 00:13:18.609 "supported_io_types": { 00:13:18.609 "read": true, 00:13:18.609 "write": true, 00:13:18.609 "unmap": true, 00:13:18.609 "write_zeroes": true, 00:13:18.609 "flush": true, 00:13:18.609 "reset": true, 00:13:18.609 "compare": false, 00:13:18.609 "compare_and_write": false, 00:13:18.609 "abort": true, 00:13:18.609 "nvme_admin": false, 00:13:18.609 "nvme_io": false 00:13:18.609 }, 00:13:18.609 "memory_domains": [ 00:13:18.609 { 00:13:18.609 "dma_device_id": "system", 00:13:18.609 "dma_device_type": 1 00:13:18.609 }, 00:13:18.609 { 00:13:18.609 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.609 "dma_device_type": 2 00:13:18.609 } 00:13:18.609 ], 00:13:18.609 "driver_specific": {} 00:13:18.609 } 00:13:18.609 ] 00:13:18.609 11:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:18.609 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:18.610 11:49:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.610 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:18.869 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:18.869 "name": "Existed_Raid", 00:13:18.869 "uuid": "a8b018d5-82e7-4c5a-8701-16bbb00f50e7", 00:13:18.869 "strip_size_kb": 64, 00:13:18.869 "state": "online", 00:13:18.869 "raid_level": "concat", 00:13:18.869 "superblock": true, 00:13:18.869 "num_base_bdevs": 3, 00:13:18.869 "num_base_bdevs_discovered": 3, 00:13:18.869 "num_base_bdevs_operational": 3, 00:13:18.869 "base_bdevs_list": [ 00:13:18.869 { 00:13:18.869 "name": "BaseBdev1", 00:13:18.869 "uuid": "7c31222b-a13e-4b10-b843-cf06a6144216", 00:13:18.869 "is_configured": true, 00:13:18.869 "data_offset": 2048, 00:13:18.869 "data_size": 63488 00:13:18.869 }, 00:13:18.869 { 00:13:18.869 "name": "BaseBdev2", 00:13:18.869 "uuid": "46b32701-ce84-4950-b1aa-2fbd1ed9d539", 00:13:18.869 "is_configured": true, 00:13:18.869 "data_offset": 2048, 00:13:18.869 "data_size": 63488 00:13:18.869 }, 00:13:18.869 { 00:13:18.869 "name": "BaseBdev3", 00:13:18.869 "uuid": "3d64ea84-e50d-4add-b396-15a8e44a9bd7", 00:13:18.869 "is_configured": true, 00:13:18.869 "data_offset": 2048, 00:13:18.869 "data_size": 63488 00:13:18.869 } 00:13:18.869 ] 00:13:18.869 }' 00:13:18.869 11:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:18.869 11:49:45 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:19.438 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:13:19.438 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:13:19.438 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:19.438 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:19.438 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:19.438 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:13:19.438 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:19.438 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:19.698 [2024-05-14 11:49:46.571359] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:19.698 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:19.698 "name": "Existed_Raid", 00:13:19.698 "aliases": [ 00:13:19.698 "a8b018d5-82e7-4c5a-8701-16bbb00f50e7" 00:13:19.698 ], 00:13:19.698 "product_name": "Raid Volume", 00:13:19.698 "block_size": 512, 00:13:19.698 "num_blocks": 190464, 00:13:19.698 "uuid": "a8b018d5-82e7-4c5a-8701-16bbb00f50e7", 00:13:19.698 "assigned_rate_limits": { 00:13:19.698 "rw_ios_per_sec": 0, 00:13:19.698 "rw_mbytes_per_sec": 0, 00:13:19.698 "r_mbytes_per_sec": 0, 00:13:19.698 "w_mbytes_per_sec": 0 00:13:19.698 }, 00:13:19.698 "claimed": false, 00:13:19.698 "zoned": false, 00:13:19.698 "supported_io_types": { 00:13:19.698 "read": true, 00:13:19.698 "write": true, 00:13:19.698 "unmap": true, 
00:13:19.698 "write_zeroes": true, 00:13:19.698 "flush": true, 00:13:19.698 "reset": true, 00:13:19.698 "compare": false, 00:13:19.698 "compare_and_write": false, 00:13:19.698 "abort": false, 00:13:19.698 "nvme_admin": false, 00:13:19.698 "nvme_io": false 00:13:19.698 }, 00:13:19.698 "memory_domains": [ 00:13:19.698 { 00:13:19.698 "dma_device_id": "system", 00:13:19.698 "dma_device_type": 1 00:13:19.698 }, 00:13:19.698 { 00:13:19.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.698 "dma_device_type": 2 00:13:19.698 }, 00:13:19.698 { 00:13:19.698 "dma_device_id": "system", 00:13:19.698 "dma_device_type": 1 00:13:19.698 }, 00:13:19.698 { 00:13:19.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.698 "dma_device_type": 2 00:13:19.698 }, 00:13:19.698 { 00:13:19.698 "dma_device_id": "system", 00:13:19.698 "dma_device_type": 1 00:13:19.698 }, 00:13:19.698 { 00:13:19.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.698 "dma_device_type": 2 00:13:19.698 } 00:13:19.698 ], 00:13:19.698 "driver_specific": { 00:13:19.698 "raid": { 00:13:19.698 "uuid": "a8b018d5-82e7-4c5a-8701-16bbb00f50e7", 00:13:19.698 "strip_size_kb": 64, 00:13:19.698 "state": "online", 00:13:19.698 "raid_level": "concat", 00:13:19.698 "superblock": true, 00:13:19.698 "num_base_bdevs": 3, 00:13:19.698 "num_base_bdevs_discovered": 3, 00:13:19.698 "num_base_bdevs_operational": 3, 00:13:19.698 "base_bdevs_list": [ 00:13:19.698 { 00:13:19.698 "name": "BaseBdev1", 00:13:19.698 "uuid": "7c31222b-a13e-4b10-b843-cf06a6144216", 00:13:19.698 "is_configured": true, 00:13:19.698 "data_offset": 2048, 00:13:19.698 "data_size": 63488 00:13:19.698 }, 00:13:19.698 { 00:13:19.698 "name": "BaseBdev2", 00:13:19.698 "uuid": "46b32701-ce84-4950-b1aa-2fbd1ed9d539", 00:13:19.698 "is_configured": true, 00:13:19.698 "data_offset": 2048, 00:13:19.698 "data_size": 63488 00:13:19.698 }, 00:13:19.698 { 00:13:19.698 "name": "BaseBdev3", 00:13:19.698 "uuid": "3d64ea84-e50d-4add-b396-15a8e44a9bd7", 00:13:19.698 
"is_configured": true, 00:13:19.698 "data_offset": 2048, 00:13:19.698 "data_size": 63488 00:13:19.698 } 00:13:19.698 ] 00:13:19.698 } 00:13:19.698 } 00:13:19.698 }' 00:13:19.698 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:19.698 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:13:19.698 BaseBdev2 00:13:19.698 BaseBdev3' 00:13:19.698 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:19.698 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:19.698 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:19.958 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:19.958 "name": "BaseBdev1", 00:13:19.958 "aliases": [ 00:13:19.958 "7c31222b-a13e-4b10-b843-cf06a6144216" 00:13:19.958 ], 00:13:19.958 "product_name": "Malloc disk", 00:13:19.958 "block_size": 512, 00:13:19.958 "num_blocks": 65536, 00:13:19.958 "uuid": "7c31222b-a13e-4b10-b843-cf06a6144216", 00:13:19.958 "assigned_rate_limits": { 00:13:19.958 "rw_ios_per_sec": 0, 00:13:19.958 "rw_mbytes_per_sec": 0, 00:13:19.958 "r_mbytes_per_sec": 0, 00:13:19.958 "w_mbytes_per_sec": 0 00:13:19.958 }, 00:13:19.958 "claimed": true, 00:13:19.958 "claim_type": "exclusive_write", 00:13:19.959 "zoned": false, 00:13:19.959 "supported_io_types": { 00:13:19.959 "read": true, 00:13:19.959 "write": true, 00:13:19.959 "unmap": true, 00:13:19.959 "write_zeroes": true, 00:13:19.959 "flush": true, 00:13:19.959 "reset": true, 00:13:19.959 "compare": false, 00:13:19.959 "compare_and_write": false, 00:13:19.959 "abort": true, 00:13:19.959 "nvme_admin": false, 00:13:19.959 
"nvme_io": false 00:13:19.959 }, 00:13:19.959 "memory_domains": [ 00:13:19.959 { 00:13:19.959 "dma_device_id": "system", 00:13:19.959 "dma_device_type": 1 00:13:19.959 }, 00:13:19.959 { 00:13:19.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.959 "dma_device_type": 2 00:13:19.959 } 00:13:19.959 ], 00:13:19.959 "driver_specific": {} 00:13:19.959 }' 00:13:19.959 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:19.959 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:19.959 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:19.959 11:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:19.959 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:20.218 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:20.218 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:20.218 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:20.218 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:20.218 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:20.218 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:20.218 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:20.218 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:20.218 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:20.218 11:49:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:20.478 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:20.478 "name": "BaseBdev2", 00:13:20.478 "aliases": [ 00:13:20.478 "46b32701-ce84-4950-b1aa-2fbd1ed9d539" 00:13:20.478 ], 00:13:20.478 "product_name": "Malloc disk", 00:13:20.478 "block_size": 512, 00:13:20.478 "num_blocks": 65536, 00:13:20.478 "uuid": "46b32701-ce84-4950-b1aa-2fbd1ed9d539", 00:13:20.478 "assigned_rate_limits": { 00:13:20.478 "rw_ios_per_sec": 0, 00:13:20.478 "rw_mbytes_per_sec": 0, 00:13:20.478 "r_mbytes_per_sec": 0, 00:13:20.478 "w_mbytes_per_sec": 0 00:13:20.478 }, 00:13:20.478 "claimed": true, 00:13:20.478 "claim_type": "exclusive_write", 00:13:20.478 "zoned": false, 00:13:20.478 "supported_io_types": { 00:13:20.478 "read": true, 00:13:20.478 "write": true, 00:13:20.478 "unmap": true, 00:13:20.478 "write_zeroes": true, 00:13:20.478 "flush": true, 00:13:20.478 "reset": true, 00:13:20.478 "compare": false, 00:13:20.478 "compare_and_write": false, 00:13:20.478 "abort": true, 00:13:20.478 "nvme_admin": false, 00:13:20.478 "nvme_io": false 00:13:20.478 }, 00:13:20.478 "memory_domains": [ 00:13:20.478 { 00:13:20.478 "dma_device_id": "system", 00:13:20.478 "dma_device_type": 1 00:13:20.478 }, 00:13:20.478 { 00:13:20.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.478 "dma_device_type": 2 00:13:20.478 } 00:13:20.478 ], 00:13:20.478 "driver_specific": {} 00:13:20.478 }' 00:13:20.478 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:20.478 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:20.737 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:20.737 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:20.737 11:49:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:20.737 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:20.737 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:20.737 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:20.737 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:20.737 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:20.737 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:20.997 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:20.997 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:20.997 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:20.997 11:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:20.997 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:20.997 "name": "BaseBdev3", 00:13:20.997 "aliases": [ 00:13:20.997 "3d64ea84-e50d-4add-b396-15a8e44a9bd7" 00:13:20.997 ], 00:13:20.997 "product_name": "Malloc disk", 00:13:20.997 "block_size": 512, 00:13:20.997 "num_blocks": 65536, 00:13:20.997 "uuid": "3d64ea84-e50d-4add-b396-15a8e44a9bd7", 00:13:20.997 "assigned_rate_limits": { 00:13:20.997 "rw_ios_per_sec": 0, 00:13:20.997 "rw_mbytes_per_sec": 0, 00:13:20.997 "r_mbytes_per_sec": 0, 00:13:20.997 "w_mbytes_per_sec": 0 00:13:20.997 }, 00:13:20.997 "claimed": true, 00:13:20.997 "claim_type": "exclusive_write", 00:13:20.997 "zoned": false, 00:13:20.997 "supported_io_types": { 00:13:20.997 "read": true, 00:13:20.997 
"write": true, 00:13:20.997 "unmap": true, 00:13:20.997 "write_zeroes": true, 00:13:20.997 "flush": true, 00:13:20.997 "reset": true, 00:13:20.997 "compare": false, 00:13:20.997 "compare_and_write": false, 00:13:20.997 "abort": true, 00:13:20.997 "nvme_admin": false, 00:13:20.997 "nvme_io": false 00:13:20.997 }, 00:13:20.997 "memory_domains": [ 00:13:20.997 { 00:13:20.997 "dma_device_id": "system", 00:13:20.997 "dma_device_type": 1 00:13:20.997 }, 00:13:20.997 { 00:13:20.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.997 "dma_device_type": 2 00:13:20.997 } 00:13:20.997 ], 00:13:20.997 "driver_specific": {} 00:13:20.997 }' 00:13:20.997 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:21.257 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:21.257 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:21.257 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:21.257 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:21.257 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:21.257 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:21.257 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:21.257 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:21.257 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:21.516 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:21.516 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:21.516 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:21.776 [2024-05-14 11:49:48.640623] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:21.776 [2024-05-14 11:49:48.640652] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:21.776 [2024-05-14 11:49:48.640697] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:21.776 11:49:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.776 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.036 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:22.036 "name": "Existed_Raid", 00:13:22.036 "uuid": "a8b018d5-82e7-4c5a-8701-16bbb00f50e7", 00:13:22.036 "strip_size_kb": 64, 00:13:22.036 "state": "offline", 00:13:22.036 "raid_level": "concat", 00:13:22.036 "superblock": true, 00:13:22.036 "num_base_bdevs": 3, 00:13:22.036 "num_base_bdevs_discovered": 2, 00:13:22.036 "num_base_bdevs_operational": 2, 00:13:22.036 "base_bdevs_list": [ 00:13:22.036 { 00:13:22.036 "name": null, 00:13:22.036 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:22.036 "is_configured": false, 00:13:22.036 "data_offset": 2048, 00:13:22.036 "data_size": 63488 00:13:22.036 }, 00:13:22.036 { 00:13:22.036 "name": "BaseBdev2", 00:13:22.036 "uuid": "46b32701-ce84-4950-b1aa-2fbd1ed9d539", 00:13:22.036 "is_configured": true, 00:13:22.036 "data_offset": 2048, 00:13:22.036 "data_size": 63488 00:13:22.036 }, 00:13:22.036 { 00:13:22.036 "name": "BaseBdev3", 00:13:22.036 "uuid": "3d64ea84-e50d-4add-b396-15a8e44a9bd7", 00:13:22.036 "is_configured": true, 00:13:22.036 "data_offset": 2048, 00:13:22.036 "data_size": 63488 00:13:22.036 } 00:13:22.036 ] 00:13:22.036 }' 00:13:22.036 11:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:22.036 11:49:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:22.605 11:49:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:13:22.605 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:22.605 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.605 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:13:22.866 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:13:22.866 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:22.866 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:23.126 [2024-05-14 11:49:49.962113] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:23.126 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:13:23.126 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:23.126 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.126 11:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:13:23.126 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:13:23.126 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:23.126 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:23.386 [2024-05-14 11:49:50.407845] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:23.386 [2024-05-14 11:49:50.407899] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2582080 name Existed_Raid, state offline 00:13:23.386 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:13:23.386 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:23.386 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.386 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:13:23.645 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:13:23.645 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:13:23.645 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:13:23.645 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:13:23.645 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:23.645 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:23.905 BaseBdev2 00:13:23.905 11:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:13:23.905 11:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:23.905 11:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:23.905 11:49:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:23.905 11:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:23.905 11:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:23.905 11:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:24.164 11:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:24.422 [ 00:13:24.422 { 00:13:24.422 "name": "BaseBdev2", 00:13:24.422 "aliases": [ 00:13:24.422 "3bb6b928-f291-4c34-93d5-cb0d743e6b9c" 00:13:24.422 ], 00:13:24.422 "product_name": "Malloc disk", 00:13:24.422 "block_size": 512, 00:13:24.422 "num_blocks": 65536, 00:13:24.422 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:24.422 "assigned_rate_limits": { 00:13:24.422 "rw_ios_per_sec": 0, 00:13:24.422 "rw_mbytes_per_sec": 0, 00:13:24.422 "r_mbytes_per_sec": 0, 00:13:24.422 "w_mbytes_per_sec": 0 00:13:24.422 }, 00:13:24.422 "claimed": false, 00:13:24.422 "zoned": false, 00:13:24.422 "supported_io_types": { 00:13:24.422 "read": true, 00:13:24.422 "write": true, 00:13:24.422 "unmap": true, 00:13:24.422 "write_zeroes": true, 00:13:24.422 "flush": true, 00:13:24.422 "reset": true, 00:13:24.422 "compare": false, 00:13:24.422 "compare_and_write": false, 00:13:24.422 "abort": true, 00:13:24.422 "nvme_admin": false, 00:13:24.422 "nvme_io": false 00:13:24.422 }, 00:13:24.422 "memory_domains": [ 00:13:24.422 { 00:13:24.422 "dma_device_id": "system", 00:13:24.422 "dma_device_type": 1 00:13:24.422 }, 00:13:24.422 { 00:13:24.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.422 "dma_device_type": 2 00:13:24.422 } 00:13:24.422 ], 
00:13:24.422 "driver_specific": {} 00:13:24.422 } 00:13:24.422 ] 00:13:24.422 11:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:24.422 11:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:13:24.422 11:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:24.422 11:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:24.681 BaseBdev3 00:13:24.681 11:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:13:24.681 11:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:24.681 11:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:24.681 11:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:24.681 11:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:24.681 11:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:24.681 11:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:24.941 11:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:25.200 [ 00:13:25.200 { 00:13:25.200 "name": "BaseBdev3", 00:13:25.200 "aliases": [ 00:13:25.200 "194a5d6c-18d4-472f-a604-a0fca47a08fa" 00:13:25.200 ], 00:13:25.200 "product_name": "Malloc disk", 00:13:25.200 "block_size": 512, 00:13:25.200 
"num_blocks": 65536, 00:13:25.200 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:25.200 "assigned_rate_limits": { 00:13:25.200 "rw_ios_per_sec": 0, 00:13:25.200 "rw_mbytes_per_sec": 0, 00:13:25.200 "r_mbytes_per_sec": 0, 00:13:25.200 "w_mbytes_per_sec": 0 00:13:25.200 }, 00:13:25.200 "claimed": false, 00:13:25.200 "zoned": false, 00:13:25.200 "supported_io_types": { 00:13:25.200 "read": true, 00:13:25.200 "write": true, 00:13:25.200 "unmap": true, 00:13:25.200 "write_zeroes": true, 00:13:25.200 "flush": true, 00:13:25.200 "reset": true, 00:13:25.200 "compare": false, 00:13:25.200 "compare_and_write": false, 00:13:25.200 "abort": true, 00:13:25.200 "nvme_admin": false, 00:13:25.200 "nvme_io": false 00:13:25.200 }, 00:13:25.200 "memory_domains": [ 00:13:25.200 { 00:13:25.200 "dma_device_id": "system", 00:13:25.200 "dma_device_type": 1 00:13:25.200 }, 00:13:25.200 { 00:13:25.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.200 "dma_device_type": 2 00:13:25.200 } 00:13:25.200 ], 00:13:25.200 "driver_specific": {} 00:13:25.200 } 00:13:25.200 ] 00:13:25.200 11:49:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:25.200 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:13:25.200 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:25.200 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:25.459 [2024-05-14 11:49:52.385278] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:25.459 [2024-05-14 11:49:52.385319] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:25.459 [2024-05-14 11:49:52.385337] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:25.459 [2024-05-14 11:49:52.386875] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.459 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.718 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:25.718 "name": "Existed_Raid", 00:13:25.718 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:25.718 "strip_size_kb": 64, 00:13:25.718 "state": 
"configuring", 00:13:25.718 "raid_level": "concat", 00:13:25.718 "superblock": true, 00:13:25.718 "num_base_bdevs": 3, 00:13:25.718 "num_base_bdevs_discovered": 2, 00:13:25.718 "num_base_bdevs_operational": 3, 00:13:25.718 "base_bdevs_list": [ 00:13:25.719 { 00:13:25.719 "name": "BaseBdev1", 00:13:25.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.719 "is_configured": false, 00:13:25.719 "data_offset": 0, 00:13:25.719 "data_size": 0 00:13:25.719 }, 00:13:25.719 { 00:13:25.719 "name": "BaseBdev2", 00:13:25.719 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:25.719 "is_configured": true, 00:13:25.719 "data_offset": 2048, 00:13:25.719 "data_size": 63488 00:13:25.719 }, 00:13:25.719 { 00:13:25.719 "name": "BaseBdev3", 00:13:25.719 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:25.719 "is_configured": true, 00:13:25.719 "data_offset": 2048, 00:13:25.719 "data_size": 63488 00:13:25.719 } 00:13:25.719 ] 00:13:25.719 }' 00:13:25.719 11:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:25.719 11:49:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:26.286 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:26.545 [2024-05-14 11:49:53.391898] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 
00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.545 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.804 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:26.804 "name": "Existed_Raid", 00:13:26.804 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:26.804 "strip_size_kb": 64, 00:13:26.804 "state": "configuring", 00:13:26.804 "raid_level": "concat", 00:13:26.804 "superblock": true, 00:13:26.804 "num_base_bdevs": 3, 00:13:26.804 "num_base_bdevs_discovered": 1, 00:13:26.804 "num_base_bdevs_operational": 3, 00:13:26.804 "base_bdevs_list": [ 00:13:26.804 { 00:13:26.804 "name": "BaseBdev1", 00:13:26.804 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.804 "is_configured": false, 00:13:26.804 "data_offset": 0, 00:13:26.804 "data_size": 0 00:13:26.804 }, 00:13:26.804 { 00:13:26.804 "name": null, 00:13:26.804 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:26.804 "is_configured": false, 00:13:26.804 "data_offset": 2048, 00:13:26.804 "data_size": 63488 00:13:26.804 }, 00:13:26.804 { 00:13:26.804 
"name": "BaseBdev3", 00:13:26.804 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:26.804 "is_configured": true, 00:13:26.804 "data_offset": 2048, 00:13:26.804 "data_size": 63488 00:13:26.804 } 00:13:26.804 ] 00:13:26.804 }' 00:13:26.804 11:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:26.804 11:49:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:27.378 11:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.378 11:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:27.378 11:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:13:27.378 11:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:27.637 [2024-05-14 11:49:54.651466] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:27.637 BaseBdev1 00:13:27.637 11:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:13:27.637 11:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:27.637 11:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:27.637 11:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:27.637 11:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:27.637 11:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:27.637 11:49:54 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:27.949 11:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:28.218 [ 00:13:28.218 { 00:13:28.218 "name": "BaseBdev1", 00:13:28.218 "aliases": [ 00:13:28.218 "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9" 00:13:28.218 ], 00:13:28.218 "product_name": "Malloc disk", 00:13:28.218 "block_size": 512, 00:13:28.218 "num_blocks": 65536, 00:13:28.218 "uuid": "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:28.218 "assigned_rate_limits": { 00:13:28.218 "rw_ios_per_sec": 0, 00:13:28.218 "rw_mbytes_per_sec": 0, 00:13:28.218 "r_mbytes_per_sec": 0, 00:13:28.218 "w_mbytes_per_sec": 0 00:13:28.218 }, 00:13:28.218 "claimed": true, 00:13:28.218 "claim_type": "exclusive_write", 00:13:28.218 "zoned": false, 00:13:28.218 "supported_io_types": { 00:13:28.218 "read": true, 00:13:28.218 "write": true, 00:13:28.218 "unmap": true, 00:13:28.218 "write_zeroes": true, 00:13:28.218 "flush": true, 00:13:28.218 "reset": true, 00:13:28.218 "compare": false, 00:13:28.218 "compare_and_write": false, 00:13:28.218 "abort": true, 00:13:28.218 "nvme_admin": false, 00:13:28.218 "nvme_io": false 00:13:28.218 }, 00:13:28.218 "memory_domains": [ 00:13:28.218 { 00:13:28.218 "dma_device_id": "system", 00:13:28.218 "dma_device_type": 1 00:13:28.218 }, 00:13:28.218 { 00:13:28.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:28.218 "dma_device_type": 2 00:13:28.218 } 00:13:28.218 ], 00:13:28.218 "driver_specific": {} 00:13:28.218 } 00:13:28.218 ] 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state 
Existed_Raid configuring concat 64 3 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.218 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.477 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:28.477 "name": "Existed_Raid", 00:13:28.477 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:28.477 "strip_size_kb": 64, 00:13:28.477 "state": "configuring", 00:13:28.477 "raid_level": "concat", 00:13:28.477 "superblock": true, 00:13:28.477 "num_base_bdevs": 3, 00:13:28.477 "num_base_bdevs_discovered": 2, 00:13:28.477 "num_base_bdevs_operational": 3, 00:13:28.477 "base_bdevs_list": [ 00:13:28.477 { 00:13:28.477 "name": "BaseBdev1", 00:13:28.477 "uuid": 
"a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:28.477 "is_configured": true, 00:13:28.477 "data_offset": 2048, 00:13:28.477 "data_size": 63488 00:13:28.477 }, 00:13:28.477 { 00:13:28.477 "name": null, 00:13:28.477 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:28.477 "is_configured": false, 00:13:28.477 "data_offset": 2048, 00:13:28.477 "data_size": 63488 00:13:28.477 }, 00:13:28.477 { 00:13:28.477 "name": "BaseBdev3", 00:13:28.477 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:28.477 "is_configured": true, 00:13:28.477 "data_offset": 2048, 00:13:28.477 "data_size": 63488 00:13:28.477 } 00:13:28.477 ] 00:13:28.477 }' 00:13:28.477 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:28.477 11:49:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:29.046 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.046 11:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:29.305 [2024-05-14 11:49:56.311902] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
expected_state=configuring 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.305 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:29.564 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:29.564 "name": "Existed_Raid", 00:13:29.564 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:29.564 "strip_size_kb": 64, 00:13:29.564 "state": "configuring", 00:13:29.564 "raid_level": "concat", 00:13:29.564 "superblock": true, 00:13:29.564 "num_base_bdevs": 3, 00:13:29.564 "num_base_bdevs_discovered": 1, 00:13:29.564 "num_base_bdevs_operational": 3, 00:13:29.564 "base_bdevs_list": [ 00:13:29.564 { 00:13:29.565 "name": "BaseBdev1", 00:13:29.565 "uuid": "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:29.565 "is_configured": true, 00:13:29.565 "data_offset": 2048, 00:13:29.565 "data_size": 63488 00:13:29.565 }, 00:13:29.565 { 00:13:29.565 "name": null, 00:13:29.565 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 
00:13:29.565 "is_configured": false, 00:13:29.565 "data_offset": 2048, 00:13:29.565 "data_size": 63488 00:13:29.565 }, 00:13:29.565 { 00:13:29.565 "name": null, 00:13:29.565 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:29.565 "is_configured": false, 00:13:29.565 "data_offset": 2048, 00:13:29.565 "data_size": 63488 00:13:29.565 } 00:13:29.565 ] 00:13:29.565 }' 00:13:29.565 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:29.565 11:49:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:30.132 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.133 11:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:30.392 [2024-05-14 11:49:57.402814] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
strip_size=64 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:30.392 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.651 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:30.651 "name": "Existed_Raid", 00:13:30.651 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:30.651 "strip_size_kb": 64, 00:13:30.651 "state": "configuring", 00:13:30.651 "raid_level": "concat", 00:13:30.651 "superblock": true, 00:13:30.651 "num_base_bdevs": 3, 00:13:30.651 "num_base_bdevs_discovered": 2, 00:13:30.651 "num_base_bdevs_operational": 3, 00:13:30.651 "base_bdevs_list": [ 00:13:30.651 { 00:13:30.651 "name": "BaseBdev1", 00:13:30.651 "uuid": "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:30.651 "is_configured": true, 00:13:30.651 "data_offset": 2048, 00:13:30.651 "data_size": 63488 00:13:30.651 }, 00:13:30.651 { 00:13:30.651 "name": null, 00:13:30.651 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:30.651 "is_configured": false, 00:13:30.651 "data_offset": 2048, 00:13:30.651 "data_size": 63488 00:13:30.651 }, 00:13:30.651 { 00:13:30.651 "name": "BaseBdev3", 00:13:30.651 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 
00:13:30.651 "is_configured": true, 00:13:30.651 "data_offset": 2048, 00:13:30.651 "data_size": 63488 00:13:30.651 } 00:13:30.651 ] 00:13:30.651 }' 00:13:30.651 11:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:30.651 11:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:31.219 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.219 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:31.479 [2024-05-14 11:49:58.513766] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:31.479 
11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.479 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.739 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:31.739 "name": "Existed_Raid", 00:13:31.739 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:31.739 "strip_size_kb": 64, 00:13:31.739 "state": "configuring", 00:13:31.739 "raid_level": "concat", 00:13:31.739 "superblock": true, 00:13:31.739 "num_base_bdevs": 3, 00:13:31.739 "num_base_bdevs_discovered": 1, 00:13:31.739 "num_base_bdevs_operational": 3, 00:13:31.739 "base_bdevs_list": [ 00:13:31.739 { 00:13:31.739 "name": null, 00:13:31.739 "uuid": "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:31.739 "is_configured": false, 00:13:31.739 "data_offset": 2048, 00:13:31.739 "data_size": 63488 00:13:31.739 }, 00:13:31.739 { 00:13:31.739 "name": null, 00:13:31.739 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:31.739 "is_configured": false, 00:13:31.739 "data_offset": 2048, 00:13:31.739 "data_size": 63488 00:13:31.739 }, 00:13:31.739 { 00:13:31.739 "name": "BaseBdev3", 00:13:31.739 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:31.739 "is_configured": true, 00:13:31.739 "data_offset": 2048, 00:13:31.739 "data_size": 63488 00:13:31.739 } 00:13:31.739 ] 00:13:31.739 }' 00:13:31.739 11:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:31.739 11:49:58 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:32.676 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.676 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:32.676 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:13:32.677 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:32.936 [2024-05-14 11:49:59.814661] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:32.936 11:49:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.936 11:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:32.936 11:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:32.936 "name": "Existed_Raid", 00:13:32.936 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:32.936 "strip_size_kb": 64, 00:13:32.936 "state": "configuring", 00:13:32.936 "raid_level": "concat", 00:13:32.936 "superblock": true, 00:13:32.936 "num_base_bdevs": 3, 00:13:32.936 "num_base_bdevs_discovered": 2, 00:13:32.936 "num_base_bdevs_operational": 3, 00:13:32.936 "base_bdevs_list": [ 00:13:32.936 { 00:13:32.936 "name": null, 00:13:32.936 "uuid": "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:32.936 "is_configured": false, 00:13:32.936 "data_offset": 2048, 00:13:32.936 "data_size": 63488 00:13:32.936 }, 00:13:32.936 { 00:13:32.936 "name": "BaseBdev2", 00:13:32.936 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:32.936 "is_configured": true, 00:13:32.936 "data_offset": 2048, 00:13:32.936 "data_size": 63488 00:13:32.936 }, 00:13:32.936 { 00:13:32.936 "name": "BaseBdev3", 00:13:32.936 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:32.936 "is_configured": true, 00:13:32.936 "data_offset": 2048, 00:13:32.936 "data_size": 63488 00:13:32.936 } 00:13:32.936 ] 00:13:32.936 }' 00:13:32.937 11:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:32.937 11:50:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:33.505 11:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.505 11:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:33.765 11:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:13:33.765 11:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.765 11:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:34.024 11:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a13c74c9-f25f-4c6a-ba7c-c174d64b60c9 00:13:34.284 [2024-05-14 11:50:01.207323] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:34.284 [2024-05-14 11:50:01.207483] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x2727230 00:13:34.284 [2024-05-14 11:50:01.207497] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:34.284 [2024-05-14 11:50:01.207673] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x249bdf0 00:13:34.284 [2024-05-14 11:50:01.207796] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2727230 00:13:34.284 [2024-05-14 11:50:01.207806] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2727230 00:13:34.284 [2024-05-14 11:50:01.207897] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:34.284 NewBaseBdev 00:13:34.284 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:13:34.284 11:50:01 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:13:34.284 11:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:34.284 11:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:34.284 11:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:34.284 11:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:34.284 11:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:34.544 [ 00:13:34.544 { 00:13:34.544 "name": "NewBaseBdev", 00:13:34.544 "aliases": [ 00:13:34.544 "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9" 00:13:34.544 ], 00:13:34.544 "product_name": "Malloc disk", 00:13:34.544 "block_size": 512, 00:13:34.544 "num_blocks": 65536, 00:13:34.544 "uuid": "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:34.544 "assigned_rate_limits": { 00:13:34.544 "rw_ios_per_sec": 0, 00:13:34.544 "rw_mbytes_per_sec": 0, 00:13:34.544 "r_mbytes_per_sec": 0, 00:13:34.544 "w_mbytes_per_sec": 0 00:13:34.544 }, 00:13:34.544 "claimed": true, 00:13:34.544 "claim_type": "exclusive_write", 00:13:34.544 "zoned": false, 00:13:34.544 "supported_io_types": { 00:13:34.544 "read": true, 00:13:34.544 "write": true, 00:13:34.544 "unmap": true, 00:13:34.544 "write_zeroes": true, 00:13:34.544 "flush": true, 00:13:34.544 "reset": true, 00:13:34.544 "compare": false, 00:13:34.544 "compare_and_write": false, 00:13:34.544 "abort": true, 00:13:34.544 "nvme_admin": false, 00:13:34.544 "nvme_io": false 00:13:34.544 }, 00:13:34.544 
"memory_domains": [ 00:13:34.544 { 00:13:34.544 "dma_device_id": "system", 00:13:34.544 "dma_device_type": 1 00:13:34.544 }, 00:13:34.544 { 00:13:34.544 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:34.544 "dma_device_type": 2 00:13:34.544 } 00:13:34.544 ], 00:13:34.544 "driver_specific": {} 00:13:34.544 } 00:13:34.544 ] 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.544 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:34.805 11:50:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:34.805 "name": "Existed_Raid", 00:13:34.805 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:34.805 "strip_size_kb": 64, 00:13:34.805 "state": "online", 00:13:34.805 "raid_level": "concat", 00:13:34.805 "superblock": true, 00:13:34.805 "num_base_bdevs": 3, 00:13:34.805 "num_base_bdevs_discovered": 3, 00:13:34.805 "num_base_bdevs_operational": 3, 00:13:34.805 "base_bdevs_list": [ 00:13:34.805 { 00:13:34.805 "name": "NewBaseBdev", 00:13:34.805 "uuid": "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:34.805 "is_configured": true, 00:13:34.805 "data_offset": 2048, 00:13:34.805 "data_size": 63488 00:13:34.805 }, 00:13:34.805 { 00:13:34.805 "name": "BaseBdev2", 00:13:34.805 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:34.805 "is_configured": true, 00:13:34.805 "data_offset": 2048, 00:13:34.805 "data_size": 63488 00:13:34.805 }, 00:13:34.805 { 00:13:34.805 "name": "BaseBdev3", 00:13:34.805 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:34.805 "is_configured": true, 00:13:34.805 "data_offset": 2048, 00:13:34.805 "data_size": 63488 00:13:34.805 } 00:13:34.805 ] 00:13:34.805 }' 00:13:34.805 11:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:34.805 11:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:35.373 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:13:35.373 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:13:35.373 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:35.373 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:35.373 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 
00:13:35.373 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:13:35.373 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:35.373 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:35.632 [2024-05-14 11:50:02.547164] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:35.632 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:35.632 "name": "Existed_Raid", 00:13:35.632 "aliases": [ 00:13:35.632 "7afe12c4-81bd-40e1-85eb-a1788b18e1c7" 00:13:35.632 ], 00:13:35.632 "product_name": "Raid Volume", 00:13:35.632 "block_size": 512, 00:13:35.632 "num_blocks": 190464, 00:13:35.632 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:35.632 "assigned_rate_limits": { 00:13:35.632 "rw_ios_per_sec": 0, 00:13:35.632 "rw_mbytes_per_sec": 0, 00:13:35.632 "r_mbytes_per_sec": 0, 00:13:35.632 "w_mbytes_per_sec": 0 00:13:35.632 }, 00:13:35.632 "claimed": false, 00:13:35.632 "zoned": false, 00:13:35.632 "supported_io_types": { 00:13:35.632 "read": true, 00:13:35.632 "write": true, 00:13:35.632 "unmap": true, 00:13:35.632 "write_zeroes": true, 00:13:35.632 "flush": true, 00:13:35.632 "reset": true, 00:13:35.632 "compare": false, 00:13:35.632 "compare_and_write": false, 00:13:35.632 "abort": false, 00:13:35.632 "nvme_admin": false, 00:13:35.632 "nvme_io": false 00:13:35.632 }, 00:13:35.632 "memory_domains": [ 00:13:35.632 { 00:13:35.632 "dma_device_id": "system", 00:13:35.632 "dma_device_type": 1 00:13:35.632 }, 00:13:35.632 { 00:13:35.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.632 "dma_device_type": 2 00:13:35.632 }, 00:13:35.632 { 00:13:35.632 "dma_device_id": "system", 00:13:35.632 "dma_device_type": 1 00:13:35.632 }, 00:13:35.632 { 00:13:35.632 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:35.632 "dma_device_type": 2 00:13:35.632 }, 00:13:35.632 { 00:13:35.632 "dma_device_id": "system", 00:13:35.632 "dma_device_type": 1 00:13:35.632 }, 00:13:35.632 { 00:13:35.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.633 "dma_device_type": 2 00:13:35.633 } 00:13:35.633 ], 00:13:35.633 "driver_specific": { 00:13:35.633 "raid": { 00:13:35.633 "uuid": "7afe12c4-81bd-40e1-85eb-a1788b18e1c7", 00:13:35.633 "strip_size_kb": 64, 00:13:35.633 "state": "online", 00:13:35.633 "raid_level": "concat", 00:13:35.633 "superblock": true, 00:13:35.633 "num_base_bdevs": 3, 00:13:35.633 "num_base_bdevs_discovered": 3, 00:13:35.633 "num_base_bdevs_operational": 3, 00:13:35.633 "base_bdevs_list": [ 00:13:35.633 { 00:13:35.633 "name": "NewBaseBdev", 00:13:35.633 "uuid": "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:35.633 "is_configured": true, 00:13:35.633 "data_offset": 2048, 00:13:35.633 "data_size": 63488 00:13:35.633 }, 00:13:35.633 { 00:13:35.633 "name": "BaseBdev2", 00:13:35.633 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:35.633 "is_configured": true, 00:13:35.633 "data_offset": 2048, 00:13:35.633 "data_size": 63488 00:13:35.633 }, 00:13:35.633 { 00:13:35.633 "name": "BaseBdev3", 00:13:35.633 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:35.633 "is_configured": true, 00:13:35.633 "data_offset": 2048, 00:13:35.633 "data_size": 63488 00:13:35.633 } 00:13:35.633 ] 00:13:35.633 } 00:13:35.633 } 00:13:35.633 }' 00:13:35.633 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:35.633 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:13:35.633 BaseBdev2 00:13:35.633 BaseBdev3' 00:13:35.633 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:35.633 11:50:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:35.633 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:35.892 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:35.892 "name": "NewBaseBdev", 00:13:35.892 "aliases": [ 00:13:35.892 "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9" 00:13:35.892 ], 00:13:35.892 "product_name": "Malloc disk", 00:13:35.892 "block_size": 512, 00:13:35.892 "num_blocks": 65536, 00:13:35.892 "uuid": "a13c74c9-f25f-4c6a-ba7c-c174d64b60c9", 00:13:35.892 "assigned_rate_limits": { 00:13:35.892 "rw_ios_per_sec": 0, 00:13:35.892 "rw_mbytes_per_sec": 0, 00:13:35.892 "r_mbytes_per_sec": 0, 00:13:35.892 "w_mbytes_per_sec": 0 00:13:35.893 }, 00:13:35.893 "claimed": true, 00:13:35.893 "claim_type": "exclusive_write", 00:13:35.893 "zoned": false, 00:13:35.893 "supported_io_types": { 00:13:35.893 "read": true, 00:13:35.893 "write": true, 00:13:35.893 "unmap": true, 00:13:35.893 "write_zeroes": true, 00:13:35.893 "flush": true, 00:13:35.893 "reset": true, 00:13:35.893 "compare": false, 00:13:35.893 "compare_and_write": false, 00:13:35.893 "abort": true, 00:13:35.893 "nvme_admin": false, 00:13:35.893 "nvme_io": false 00:13:35.893 }, 00:13:35.893 "memory_domains": [ 00:13:35.893 { 00:13:35.893 "dma_device_id": "system", 00:13:35.893 "dma_device_type": 1 00:13:35.893 }, 00:13:35.893 { 00:13:35.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.893 "dma_device_type": 2 00:13:35.893 } 00:13:35.893 ], 00:13:35.893 "driver_specific": {} 00:13:35.893 }' 00:13:35.893 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:35.893 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:35.893 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 
00:13:35.893 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:35.893 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:35.893 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:35.893 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:36.152 11:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:36.152 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.152 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:36.152 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:36.152 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:36.152 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:36.152 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:36.152 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:36.411 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:36.411 "name": "BaseBdev2", 00:13:36.411 "aliases": [ 00:13:36.411 "3bb6b928-f291-4c34-93d5-cb0d743e6b9c" 00:13:36.411 ], 00:13:36.411 "product_name": "Malloc disk", 00:13:36.411 "block_size": 512, 00:13:36.411 "num_blocks": 65536, 00:13:36.411 "uuid": "3bb6b928-f291-4c34-93d5-cb0d743e6b9c", 00:13:36.411 "assigned_rate_limits": { 00:13:36.411 "rw_ios_per_sec": 0, 00:13:36.411 "rw_mbytes_per_sec": 0, 00:13:36.411 "r_mbytes_per_sec": 0, 00:13:36.411 "w_mbytes_per_sec": 0 00:13:36.411 }, 00:13:36.411 
"claimed": true, 00:13:36.411 "claim_type": "exclusive_write", 00:13:36.411 "zoned": false, 00:13:36.411 "supported_io_types": { 00:13:36.411 "read": true, 00:13:36.411 "write": true, 00:13:36.411 "unmap": true, 00:13:36.411 "write_zeroes": true, 00:13:36.411 "flush": true, 00:13:36.411 "reset": true, 00:13:36.411 "compare": false, 00:13:36.411 "compare_and_write": false, 00:13:36.411 "abort": true, 00:13:36.411 "nvme_admin": false, 00:13:36.411 "nvme_io": false 00:13:36.411 }, 00:13:36.411 "memory_domains": [ 00:13:36.411 { 00:13:36.411 "dma_device_id": "system", 00:13:36.411 "dma_device_type": 1 00:13:36.411 }, 00:13:36.411 { 00:13:36.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.411 "dma_device_type": 2 00:13:36.411 } 00:13:36.411 ], 00:13:36.411 "driver_specific": {} 00:13:36.411 }' 00:13:36.411 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:36.411 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:36.411 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:36.411 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:36.670 11:50:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:36.670 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:36.929 11:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:36.929 "name": "BaseBdev3", 00:13:36.929 "aliases": [ 00:13:36.929 "194a5d6c-18d4-472f-a604-a0fca47a08fa" 00:13:36.929 ], 00:13:36.929 "product_name": "Malloc disk", 00:13:36.929 "block_size": 512, 00:13:36.929 "num_blocks": 65536, 00:13:36.929 "uuid": "194a5d6c-18d4-472f-a604-a0fca47a08fa", 00:13:36.929 "assigned_rate_limits": { 00:13:36.929 "rw_ios_per_sec": 0, 00:13:36.929 "rw_mbytes_per_sec": 0, 00:13:36.929 "r_mbytes_per_sec": 0, 00:13:36.929 "w_mbytes_per_sec": 0 00:13:36.929 }, 00:13:36.929 "claimed": true, 00:13:36.929 "claim_type": "exclusive_write", 00:13:36.929 "zoned": false, 00:13:36.929 "supported_io_types": { 00:13:36.929 "read": true, 00:13:36.929 "write": true, 00:13:36.929 "unmap": true, 00:13:36.929 "write_zeroes": true, 00:13:36.929 "flush": true, 00:13:36.929 "reset": true, 00:13:36.929 "compare": false, 00:13:36.929 "compare_and_write": false, 00:13:36.929 "abort": true, 00:13:36.929 "nvme_admin": false, 00:13:36.929 "nvme_io": false 00:13:36.929 }, 00:13:36.929 "memory_domains": [ 00:13:36.929 { 00:13:36.929 "dma_device_id": "system", 00:13:36.929 "dma_device_type": 1 00:13:36.929 }, 00:13:36.930 { 00:13:36.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.930 "dma_device_type": 2 00:13:36.930 } 00:13:36.930 ], 00:13:36.930 "driver_specific": {} 00:13:36.930 }' 00:13:36.930 11:50:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:36.930 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:37.189 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:37.189 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:37.189 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:37.189 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:37.189 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:37.189 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:37.189 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:37.189 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:37.189 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:37.448 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:37.448 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:37.448 [2024-05-14 11:50:04.528153] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:37.448 [2024-05-14 11:50:04.528178] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:37.448 [2024-05-14 11:50:04.528227] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:37.448 [2024-05-14 11:50:04.528280] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:37.448 [2024-05-14 11:50:04.528292] bdev_raid.c: 
350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2727230 name Existed_Raid, state offline 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 1693736 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1693736 ']' 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 1693736 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1693736 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1693736' 00:13:37.708 killing process with pid 1693736 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 1693736 00:13:37.708 [2024-05-14 11:50:04.599060] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:37.708 11:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 1693736 00:13:37.708 [2024-05-14 11:50:04.662857] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:37.968 11:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:13:37.968 00:13:37.968 real 0m26.493s 00:13:37.968 user 0m48.353s 00:13:37.968 sys 0m4.771s 00:13:37.968 11:50:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:13:37.968 11:50:05 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:37.968 ************************************ 00:13:37.968 END TEST raid_state_function_test_sb 00:13:37.968 ************************************ 00:13:38.228 11:50:05 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:13:38.228 11:50:05 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:13:38.228 11:50:05 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:13:38.228 11:50:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:38.228 ************************************ 00:13:38.228 START TEST raid_superblock_test 00:13:38.228 ************************************ 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test concat 3 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=concat 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=3 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' concat '!=' raid1 ']' 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=1697849 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 1697849 /var/tmp/spdk-raid.sock 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 1697849 ']' 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:38.228 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:13:38.228 11:50:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.228 [2024-05-14 11:50:05.186764] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:13:38.228 [2024-05-14 11:50:05.186813] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1697849 ] 00:13:38.228 [2024-05-14 11:50:05.301705] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:38.487 [2024-05-14 11:50:05.411679] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:38.487 [2024-05-14 11:50:05.471238] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.487 [2024-05-14 11:50:05.471273] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:39.057 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:13:39.316 malloc1 00:13:39.316 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:39.576 [2024-05-14 11:50:06.455167] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:39.576 [2024-05-14 11:50:06.455214] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.576 [2024-05-14 11:50:06.455234] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20712a0 00:13:39.576 [2024-05-14 11:50:06.455247] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.576 [2024-05-14 11:50:06.456809] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.576 [2024-05-14 11:50:06.456836] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:39.576 pt1 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:39.576 malloc2 00:13:39.576 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:39.836 [2024-05-14 11:50:06.812855] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:39.836 [2024-05-14 11:50:06.812897] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.836 [2024-05-14 11:50:06.812917] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2224480 00:13:39.836 [2024-05-14 11:50:06.812931] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.836 [2024-05-14 11:50:06.814287] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.836 [2024-05-14 11:50:06.814313] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:39.836 pt2 00:13:39.836 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:13:39.836 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:39.836 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:13:39.836 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:13:39.836 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:39.836 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:39.836 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:13:39.836 11:50:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:39.836 11:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:40.096 malloc3 00:13:40.096 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:40.096 [2024-05-14 11:50:07.162473] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:40.096 [2024-05-14 11:50:07.162518] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:40.096 [2024-05-14 11:50:07.162538] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x206ae80 00:13:40.096 [2024-05-14 11:50:07.162550] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:40.096 [2024-05-14 11:50:07.163918] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:40.096 [2024-05-14 11:50:07.163945] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:40.096 pt3 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:40.356 [2024-05-14 11:50:07.330945] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:40.356 [2024-05-14 11:50:07.332167] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:40.356 [2024-05-14 
11:50:07.332225] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:40.356 [2024-05-14 11:50:07.332375] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x206cdc0 00:13:40.356 [2024-05-14 11:50:07.332386] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:40.356 [2024-05-14 11:50:07.332579] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2071900 00:13:40.356 [2024-05-14 11:50:07.332718] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x206cdc0 00:13:40.356 [2024-05-14 11:50:07.332728] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x206cdc0 00:13:40.356 [2024-05-14 11:50:07.332819] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:40.356 11:50:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.356 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:40.615 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:40.615 "name": "raid_bdev1", 00:13:40.615 "uuid": "241658cd-0150-4b50-8fd9-b260c26dfc67", 00:13:40.615 "strip_size_kb": 64, 00:13:40.615 "state": "online", 00:13:40.615 "raid_level": "concat", 00:13:40.615 "superblock": true, 00:13:40.615 "num_base_bdevs": 3, 00:13:40.615 "num_base_bdevs_discovered": 3, 00:13:40.615 "num_base_bdevs_operational": 3, 00:13:40.615 "base_bdevs_list": [ 00:13:40.615 { 00:13:40.615 "name": "pt1", 00:13:40.615 "uuid": "f29ec9c5-3b36-51ed-85e5-022779bf2132", 00:13:40.615 "is_configured": true, 00:13:40.615 "data_offset": 2048, 00:13:40.615 "data_size": 63488 00:13:40.615 }, 00:13:40.615 { 00:13:40.615 "name": "pt2", 00:13:40.615 "uuid": "d12d6697-067d-5f8a-8b52-96bd289cdab3", 00:13:40.615 "is_configured": true, 00:13:40.615 "data_offset": 2048, 00:13:40.615 "data_size": 63488 00:13:40.615 }, 00:13:40.615 { 00:13:40.615 "name": "pt3", 00:13:40.615 "uuid": "00fb8873-90cd-5131-8a40-fb306397e398", 00:13:40.615 "is_configured": true, 00:13:40.615 "data_offset": 2048, 00:13:40.615 "data_size": 63488 00:13:40.615 } 00:13:40.615 ] 00:13:40.615 }' 00:13:40.615 11:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:40.615 11:50:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.183 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:13:41.183 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:13:41.183 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 
-- # local raid_bdev_info 00:13:41.183 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:41.183 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:41.183 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:41.183 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:41.183 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:41.183 [2024-05-14 11:50:08.253618] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:41.443 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:41.443 "name": "raid_bdev1", 00:13:41.443 "aliases": [ 00:13:41.443 "241658cd-0150-4b50-8fd9-b260c26dfc67" 00:13:41.443 ], 00:13:41.443 "product_name": "Raid Volume", 00:13:41.443 "block_size": 512, 00:13:41.443 "num_blocks": 190464, 00:13:41.443 "uuid": "241658cd-0150-4b50-8fd9-b260c26dfc67", 00:13:41.443 "assigned_rate_limits": { 00:13:41.443 "rw_ios_per_sec": 0, 00:13:41.443 "rw_mbytes_per_sec": 0, 00:13:41.443 "r_mbytes_per_sec": 0, 00:13:41.443 "w_mbytes_per_sec": 0 00:13:41.443 }, 00:13:41.443 "claimed": false, 00:13:41.443 "zoned": false, 00:13:41.443 "supported_io_types": { 00:13:41.443 "read": true, 00:13:41.443 "write": true, 00:13:41.443 "unmap": true, 00:13:41.443 "write_zeroes": true, 00:13:41.443 "flush": true, 00:13:41.443 "reset": true, 00:13:41.443 "compare": false, 00:13:41.443 "compare_and_write": false, 00:13:41.443 "abort": false, 00:13:41.443 "nvme_admin": false, 00:13:41.443 "nvme_io": false 00:13:41.443 }, 00:13:41.443 "memory_domains": [ 00:13:41.443 { 00:13:41.443 "dma_device_id": "system", 00:13:41.443 "dma_device_type": 1 00:13:41.443 }, 00:13:41.443 { 00:13:41.443 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:41.443 "dma_device_type": 2 00:13:41.443 }, 00:13:41.443 { 00:13:41.443 "dma_device_id": "system", 00:13:41.443 "dma_device_type": 1 00:13:41.443 }, 00:13:41.443 { 00:13:41.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.443 "dma_device_type": 2 00:13:41.443 }, 00:13:41.443 { 00:13:41.443 "dma_device_id": "system", 00:13:41.443 "dma_device_type": 1 00:13:41.443 }, 00:13:41.443 { 00:13:41.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.443 "dma_device_type": 2 00:13:41.443 } 00:13:41.443 ], 00:13:41.443 "driver_specific": { 00:13:41.443 "raid": { 00:13:41.443 "uuid": "241658cd-0150-4b50-8fd9-b260c26dfc67", 00:13:41.443 "strip_size_kb": 64, 00:13:41.443 "state": "online", 00:13:41.443 "raid_level": "concat", 00:13:41.443 "superblock": true, 00:13:41.443 "num_base_bdevs": 3, 00:13:41.443 "num_base_bdevs_discovered": 3, 00:13:41.443 "num_base_bdevs_operational": 3, 00:13:41.443 "base_bdevs_list": [ 00:13:41.443 { 00:13:41.443 "name": "pt1", 00:13:41.443 "uuid": "f29ec9c5-3b36-51ed-85e5-022779bf2132", 00:13:41.443 "is_configured": true, 00:13:41.443 "data_offset": 2048, 00:13:41.443 "data_size": 63488 00:13:41.443 }, 00:13:41.443 { 00:13:41.443 "name": "pt2", 00:13:41.443 "uuid": "d12d6697-067d-5f8a-8b52-96bd289cdab3", 00:13:41.443 "is_configured": true, 00:13:41.443 "data_offset": 2048, 00:13:41.443 "data_size": 63488 00:13:41.443 }, 00:13:41.443 { 00:13:41.443 "name": "pt3", 00:13:41.443 "uuid": "00fb8873-90cd-5131-8a40-fb306397e398", 00:13:41.443 "is_configured": true, 00:13:41.443 "data_offset": 2048, 00:13:41.443 "data_size": 63488 00:13:41.443 } 00:13:41.443 ] 00:13:41.443 } 00:13:41.443 } 00:13:41.443 }' 00:13:41.443 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:41.443 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:13:41.443 pt2 00:13:41.443 pt3' 00:13:41.443 
11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:41.443 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:41.443 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:41.703 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:41.703 "name": "pt1", 00:13:41.703 "aliases": [ 00:13:41.703 "f29ec9c5-3b36-51ed-85e5-022779bf2132" 00:13:41.703 ], 00:13:41.703 "product_name": "passthru", 00:13:41.703 "block_size": 512, 00:13:41.703 "num_blocks": 65536, 00:13:41.703 "uuid": "f29ec9c5-3b36-51ed-85e5-022779bf2132", 00:13:41.703 "assigned_rate_limits": { 00:13:41.703 "rw_ios_per_sec": 0, 00:13:41.703 "rw_mbytes_per_sec": 0, 00:13:41.703 "r_mbytes_per_sec": 0, 00:13:41.703 "w_mbytes_per_sec": 0 00:13:41.703 }, 00:13:41.703 "claimed": true, 00:13:41.703 "claim_type": "exclusive_write", 00:13:41.703 "zoned": false, 00:13:41.703 "supported_io_types": { 00:13:41.703 "read": true, 00:13:41.703 "write": true, 00:13:41.703 "unmap": true, 00:13:41.703 "write_zeroes": true, 00:13:41.703 "flush": true, 00:13:41.703 "reset": true, 00:13:41.703 "compare": false, 00:13:41.703 "compare_and_write": false, 00:13:41.703 "abort": true, 00:13:41.703 "nvme_admin": false, 00:13:41.703 "nvme_io": false 00:13:41.703 }, 00:13:41.703 "memory_domains": [ 00:13:41.703 { 00:13:41.703 "dma_device_id": "system", 00:13:41.703 "dma_device_type": 1 00:13:41.703 }, 00:13:41.703 { 00:13:41.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.703 "dma_device_type": 2 00:13:41.703 } 00:13:41.703 ], 00:13:41.703 "driver_specific": { 00:13:41.703 "passthru": { 00:13:41.703 "name": "pt1", 00:13:41.703 "base_bdev_name": "malloc1" 00:13:41.703 } 00:13:41.703 } 00:13:41.703 }' 00:13:41.703 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- 
# jq .block_size 00:13:41.703 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:41.703 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:41.703 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:41.703 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:41.703 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:41.703 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:41.703 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:41.963 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:41.963 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:41.963 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:41.963 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:41.963 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:41.963 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:41.963 11:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:42.223 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:42.223 "name": "pt2", 00:13:42.223 "aliases": [ 00:13:42.223 "d12d6697-067d-5f8a-8b52-96bd289cdab3" 00:13:42.223 ], 00:13:42.223 "product_name": "passthru", 00:13:42.223 "block_size": 512, 00:13:42.223 "num_blocks": 65536, 00:13:42.223 "uuid": "d12d6697-067d-5f8a-8b52-96bd289cdab3", 00:13:42.223 "assigned_rate_limits": { 00:13:42.223 "rw_ios_per_sec": 0, 00:13:42.223 
"rw_mbytes_per_sec": 0, 00:13:42.223 "r_mbytes_per_sec": 0, 00:13:42.223 "w_mbytes_per_sec": 0 00:13:42.223 }, 00:13:42.223 "claimed": true, 00:13:42.223 "claim_type": "exclusive_write", 00:13:42.223 "zoned": false, 00:13:42.223 "supported_io_types": { 00:13:42.223 "read": true, 00:13:42.223 "write": true, 00:13:42.223 "unmap": true, 00:13:42.223 "write_zeroes": true, 00:13:42.223 "flush": true, 00:13:42.223 "reset": true, 00:13:42.223 "compare": false, 00:13:42.223 "compare_and_write": false, 00:13:42.223 "abort": true, 00:13:42.223 "nvme_admin": false, 00:13:42.223 "nvme_io": false 00:13:42.223 }, 00:13:42.223 "memory_domains": [ 00:13:42.223 { 00:13:42.223 "dma_device_id": "system", 00:13:42.223 "dma_device_type": 1 00:13:42.223 }, 00:13:42.223 { 00:13:42.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.223 "dma_device_type": 2 00:13:42.223 } 00:13:42.223 ], 00:13:42.223 "driver_specific": { 00:13:42.223 "passthru": { 00:13:42.223 "name": "pt2", 00:13:42.223 "base_bdev_name": "malloc2" 00:13:42.223 } 00:13:42.223 } 00:13:42.223 }' 00:13:42.223 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:42.223 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:42.223 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:42.223 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:42.223 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:42.486 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:42.486 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:42.486 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:42.486 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:42.486 11:50:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:42.486 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:42.486 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:42.486 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:42.486 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:42.486 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:42.795 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:42.795 "name": "pt3", 00:13:42.795 "aliases": [ 00:13:42.795 "00fb8873-90cd-5131-8a40-fb306397e398" 00:13:42.795 ], 00:13:42.795 "product_name": "passthru", 00:13:42.795 "block_size": 512, 00:13:42.795 "num_blocks": 65536, 00:13:42.795 "uuid": "00fb8873-90cd-5131-8a40-fb306397e398", 00:13:42.795 "assigned_rate_limits": { 00:13:42.795 "rw_ios_per_sec": 0, 00:13:42.795 "rw_mbytes_per_sec": 0, 00:13:42.795 "r_mbytes_per_sec": 0, 00:13:42.795 "w_mbytes_per_sec": 0 00:13:42.795 }, 00:13:42.795 "claimed": true, 00:13:42.795 "claim_type": "exclusive_write", 00:13:42.795 "zoned": false, 00:13:42.795 "supported_io_types": { 00:13:42.795 "read": true, 00:13:42.795 "write": true, 00:13:42.795 "unmap": true, 00:13:42.795 "write_zeroes": true, 00:13:42.795 "flush": true, 00:13:42.795 "reset": true, 00:13:42.795 "compare": false, 00:13:42.795 "compare_and_write": false, 00:13:42.795 "abort": true, 00:13:42.795 "nvme_admin": false, 00:13:42.795 "nvme_io": false 00:13:42.795 }, 00:13:42.795 "memory_domains": [ 00:13:42.795 { 00:13:42.795 "dma_device_id": "system", 00:13:42.795 "dma_device_type": 1 00:13:42.795 }, 00:13:42.795 { 00:13:42.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.795 "dma_device_type": 2 
00:13:42.795 } 00:13:42.795 ], 00:13:42.795 "driver_specific": { 00:13:42.795 "passthru": { 00:13:42.795 "name": "pt3", 00:13:42.795 "base_bdev_name": "malloc3" 00:13:42.795 } 00:13:42.795 } 00:13:42.795 }' 00:13:42.795 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:42.795 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:42.795 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:42.795 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:42.795 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:43.053 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:43.053 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:43.053 11:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:43.053 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:43.053 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:43.053 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:43.053 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:43.053 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:43.053 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:13:43.312 [2024-05-14 11:50:10.262992] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:43.312 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=241658cd-0150-4b50-8fd9-b260c26dfc67 00:13:43.312 11:50:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 241658cd-0150-4b50-8fd9-b260c26dfc67 ']' 00:13:43.312 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:43.571 [2024-05-14 11:50:10.507357] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:43.571 [2024-05-14 11:50:10.507380] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:43.572 [2024-05-14 11:50:10.507441] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:43.572 [2024-05-14 11:50:10.507500] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:43.572 [2024-05-14 11:50:10.507513] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x206cdc0 name raid_bdev1, state offline 00:13:43.572 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.572 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:13:43.831 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:13:43.831 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:13:43.831 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:13:43.831 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:44.090 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:13:44.090 11:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:44.350 11:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:13:44.350 11:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:44.609 11:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:44.609 11:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:44.869 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:45.129 [2024-05-14 11:50:11.971169] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:45.129 [2024-05-14 11:50:11.972526] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:45.129 [2024-05-14 11:50:11.972570] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:45.129 [2024-05-14 11:50:11.972616] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:45.129 [2024-05-14 11:50:11.972655] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:45.129 [2024-05-14 11:50:11.972678] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:45.129 [2024-05-14 11:50:11.972695] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:45.129 [2024-05-14 11:50:11.972705] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x206bff0 name raid_bdev1, state configuring 00:13:45.129 request: 00:13:45.129 { 00:13:45.129 "name": "raid_bdev1", 00:13:45.129 "raid_level": "concat", 00:13:45.129 "base_bdevs": [ 00:13:45.129 "malloc1", 00:13:45.129 "malloc2", 00:13:45.129 "malloc3" 00:13:45.129 ], 00:13:45.129 "superblock": false, 00:13:45.129 "strip_size_kb": 64, 00:13:45.129 "method": "bdev_raid_create", 00:13:45.129 "req_id": 1 00:13:45.129 } 00:13:45.129 Got JSON-RPC error response 00:13:45.129 response: 00:13:45.129 { 00:13:45.129 "code": -17, 00:13:45.129 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:45.129 } 00:13:45.129 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:45.129 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:45.129 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:45.129 11:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:45.129 11:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.129 11:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:45.389 [2024-05-14 11:50:12.448364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:45.389 [2024-05-14 11:50:12.448417] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:13:45.389 [2024-05-14 11:50:12.448442] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x221a060 00:13:45.389 [2024-05-14 11:50:12.448455] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:45.389 [2024-05-14 11:50:12.450129] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:45.389 [2024-05-14 11:50:12.450158] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:45.389 [2024-05-14 11:50:12.450232] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:13:45.389 [2024-05-14 11:50:12.450260] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:45.389 pt1 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.389 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:45.648 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:45.648 "name": "raid_bdev1", 00:13:45.648 "uuid": "241658cd-0150-4b50-8fd9-b260c26dfc67", 00:13:45.648 "strip_size_kb": 64, 00:13:45.648 "state": "configuring", 00:13:45.648 "raid_level": "concat", 00:13:45.648 "superblock": true, 00:13:45.648 "num_base_bdevs": 3, 00:13:45.648 "num_base_bdevs_discovered": 1, 00:13:45.648 "num_base_bdevs_operational": 3, 00:13:45.648 "base_bdevs_list": [ 00:13:45.648 { 00:13:45.648 "name": "pt1", 00:13:45.648 "uuid": "f29ec9c5-3b36-51ed-85e5-022779bf2132", 00:13:45.648 "is_configured": true, 00:13:45.648 "data_offset": 2048, 00:13:45.648 "data_size": 63488 00:13:45.648 }, 00:13:45.648 { 00:13:45.648 "name": null, 00:13:45.648 "uuid": "d12d6697-067d-5f8a-8b52-96bd289cdab3", 00:13:45.648 "is_configured": false, 00:13:45.648 "data_offset": 2048, 00:13:45.648 "data_size": 63488 00:13:45.648 }, 00:13:45.648 { 00:13:45.648 "name": null, 00:13:45.648 "uuid": "00fb8873-90cd-5131-8a40-fb306397e398", 00:13:45.648 "is_configured": false, 00:13:45.648 "data_offset": 2048, 00:13:45.648 "data_size": 63488 00:13:45.648 } 00:13:45.648 ] 00:13:45.648 }' 00:13:45.648 11:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:45.648 11:50:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.217 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 3 -gt 2 ']' 00:13:46.217 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:46.476 
[2024-05-14 11:50:13.519208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:46.476 [2024-05-14 11:50:13.519262] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:46.476 [2024-05-14 11:50:13.519283] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2072020 00:13:46.476 [2024-05-14 11:50:13.519296] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:46.476 [2024-05-14 11:50:13.519649] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:46.476 [2024-05-14 11:50:13.519668] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:46.476 [2024-05-14 11:50:13.519731] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:13:46.476 [2024-05-14 11:50:13.519751] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:46.476 pt2 00:13:46.476 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:46.735 [2024-05-14 11:50:13.759853] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:46.735 11:50:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.735 11:50:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:46.994 11:50:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:46.994 "name": "raid_bdev1", 00:13:46.994 "uuid": "241658cd-0150-4b50-8fd9-b260c26dfc67", 00:13:46.994 "strip_size_kb": 64, 00:13:46.994 "state": "configuring", 00:13:46.994 "raid_level": "concat", 00:13:46.994 "superblock": true, 00:13:46.994 "num_base_bdevs": 3, 00:13:46.994 "num_base_bdevs_discovered": 1, 00:13:46.994 "num_base_bdevs_operational": 3, 00:13:46.994 "base_bdevs_list": [ 00:13:46.994 { 00:13:46.994 "name": "pt1", 00:13:46.994 "uuid": "f29ec9c5-3b36-51ed-85e5-022779bf2132", 00:13:46.994 "is_configured": true, 00:13:46.994 "data_offset": 2048, 00:13:46.994 "data_size": 63488 00:13:46.994 }, 00:13:46.994 { 00:13:46.994 "name": null, 00:13:46.994 "uuid": "d12d6697-067d-5f8a-8b52-96bd289cdab3", 00:13:46.994 "is_configured": false, 00:13:46.994 "data_offset": 2048, 00:13:46.994 "data_size": 63488 00:13:46.994 }, 00:13:46.994 { 00:13:46.994 "name": null, 00:13:46.994 "uuid": "00fb8873-90cd-5131-8a40-fb306397e398", 00:13:46.994 "is_configured": false, 00:13:46.994 "data_offset": 2048, 00:13:46.994 "data_size": 63488 00:13:46.994 } 00:13:46.994 ] 00:13:46.994 }' 00:13:46.994 11:50:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # 
xtrace_disable 00:13:46.994 11:50:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:47.561 11:50:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:13:47.561 11:50:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:13:47.561 11:50:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:47.820 [2024-05-14 11:50:14.850732] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:47.820 [2024-05-14 11:50:14.850784] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:47.820 [2024-05-14 11:50:14.850804] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2069530 00:13:47.820 [2024-05-14 11:50:14.850817] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:47.820 [2024-05-14 11:50:14.851161] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:47.820 [2024-05-14 11:50:14.851178] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:47.820 [2024-05-14 11:50:14.851245] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:13:47.820 [2024-05-14 11:50:14.851264] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:47.820 pt2 00:13:47.820 11:50:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:13:47.820 11:50:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:13:47.821 11:50:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 
00000000-0000-0000-0000-000000000003 00:13:48.080 [2024-05-14 11:50:15.095374] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:48.080 [2024-05-14 11:50:15.095422] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:48.080 [2024-05-14 11:50:15.095443] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20698c0 00:13:48.080 [2024-05-14 11:50:15.095455] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:48.080 [2024-05-14 11:50:15.095761] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:48.080 [2024-05-14 11:50:15.095778] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:48.080 [2024-05-14 11:50:15.095833] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:13:48.080 [2024-05-14 11:50:15.095851] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:48.080 [2024-05-14 11:50:15.095960] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x206d7b0 00:13:48.080 [2024-05-14 11:50:15.095970] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:48.080 [2024-05-14 11:50:15.096133] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2070b00 00:13:48.080 [2024-05-14 11:50:15.096258] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x206d7b0 00:13:48.080 [2024-05-14 11:50:15.096267] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x206d7b0 00:13:48.080 [2024-05-14 11:50:15.096362] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:48.080 pt3 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:13:48.080 
11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.080 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:48.339 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:48.339 "name": "raid_bdev1", 00:13:48.339 "uuid": "241658cd-0150-4b50-8fd9-b260c26dfc67", 00:13:48.339 "strip_size_kb": 64, 00:13:48.339 "state": "online", 00:13:48.339 "raid_level": "concat", 00:13:48.339 "superblock": true, 00:13:48.339 "num_base_bdevs": 3, 00:13:48.339 "num_base_bdevs_discovered": 3, 00:13:48.339 "num_base_bdevs_operational": 3, 00:13:48.339 "base_bdevs_list": [ 00:13:48.339 { 00:13:48.339 "name": "pt1", 00:13:48.339 "uuid": 
"f29ec9c5-3b36-51ed-85e5-022779bf2132", 00:13:48.339 "is_configured": true, 00:13:48.339 "data_offset": 2048, 00:13:48.339 "data_size": 63488 00:13:48.339 }, 00:13:48.339 { 00:13:48.339 "name": "pt2", 00:13:48.339 "uuid": "d12d6697-067d-5f8a-8b52-96bd289cdab3", 00:13:48.339 "is_configured": true, 00:13:48.339 "data_offset": 2048, 00:13:48.339 "data_size": 63488 00:13:48.339 }, 00:13:48.339 { 00:13:48.339 "name": "pt3", 00:13:48.339 "uuid": "00fb8873-90cd-5131-8a40-fb306397e398", 00:13:48.339 "is_configured": true, 00:13:48.339 "data_offset": 2048, 00:13:48.339 "data_size": 63488 00:13:48.339 } 00:13:48.339 ] 00:13:48.339 }' 00:13:48.339 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:48.339 11:50:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.907 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:13:48.907 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:13:48.907 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:48.907 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:48.907 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:48.907 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:48.907 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:48.907 11:50:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:49.166 [2024-05-14 11:50:16.170477] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:49.166 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:49.166 
"name": "raid_bdev1", 00:13:49.166 "aliases": [ 00:13:49.166 "241658cd-0150-4b50-8fd9-b260c26dfc67" 00:13:49.166 ], 00:13:49.166 "product_name": "Raid Volume", 00:13:49.166 "block_size": 512, 00:13:49.166 "num_blocks": 190464, 00:13:49.166 "uuid": "241658cd-0150-4b50-8fd9-b260c26dfc67", 00:13:49.166 "assigned_rate_limits": { 00:13:49.166 "rw_ios_per_sec": 0, 00:13:49.166 "rw_mbytes_per_sec": 0, 00:13:49.166 "r_mbytes_per_sec": 0, 00:13:49.166 "w_mbytes_per_sec": 0 00:13:49.166 }, 00:13:49.166 "claimed": false, 00:13:49.166 "zoned": false, 00:13:49.166 "supported_io_types": { 00:13:49.166 "read": true, 00:13:49.166 "write": true, 00:13:49.166 "unmap": true, 00:13:49.166 "write_zeroes": true, 00:13:49.166 "flush": true, 00:13:49.166 "reset": true, 00:13:49.166 "compare": false, 00:13:49.166 "compare_and_write": false, 00:13:49.166 "abort": false, 00:13:49.166 "nvme_admin": false, 00:13:49.166 "nvme_io": false 00:13:49.166 }, 00:13:49.166 "memory_domains": [ 00:13:49.166 { 00:13:49.166 "dma_device_id": "system", 00:13:49.166 "dma_device_type": 1 00:13:49.166 }, 00:13:49.166 { 00:13:49.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.166 "dma_device_type": 2 00:13:49.166 }, 00:13:49.166 { 00:13:49.166 "dma_device_id": "system", 00:13:49.166 "dma_device_type": 1 00:13:49.166 }, 00:13:49.166 { 00:13:49.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.166 "dma_device_type": 2 00:13:49.166 }, 00:13:49.166 { 00:13:49.166 "dma_device_id": "system", 00:13:49.166 "dma_device_type": 1 00:13:49.166 }, 00:13:49.166 { 00:13:49.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.166 "dma_device_type": 2 00:13:49.166 } 00:13:49.166 ], 00:13:49.166 "driver_specific": { 00:13:49.166 "raid": { 00:13:49.166 "uuid": "241658cd-0150-4b50-8fd9-b260c26dfc67", 00:13:49.166 "strip_size_kb": 64, 00:13:49.166 "state": "online", 00:13:49.166 "raid_level": "concat", 00:13:49.166 "superblock": true, 00:13:49.166 "num_base_bdevs": 3, 00:13:49.166 "num_base_bdevs_discovered": 3, 
00:13:49.166 "num_base_bdevs_operational": 3, 00:13:49.166 "base_bdevs_list": [ 00:13:49.166 { 00:13:49.166 "name": "pt1", 00:13:49.166 "uuid": "f29ec9c5-3b36-51ed-85e5-022779bf2132", 00:13:49.166 "is_configured": true, 00:13:49.166 "data_offset": 2048, 00:13:49.166 "data_size": 63488 00:13:49.166 }, 00:13:49.166 { 00:13:49.166 "name": "pt2", 00:13:49.166 "uuid": "d12d6697-067d-5f8a-8b52-96bd289cdab3", 00:13:49.166 "is_configured": true, 00:13:49.166 "data_offset": 2048, 00:13:49.166 "data_size": 63488 00:13:49.166 }, 00:13:49.166 { 00:13:49.166 "name": "pt3", 00:13:49.166 "uuid": "00fb8873-90cd-5131-8a40-fb306397e398", 00:13:49.166 "is_configured": true, 00:13:49.166 "data_offset": 2048, 00:13:49.166 "data_size": 63488 00:13:49.166 } 00:13:49.166 ] 00:13:49.166 } 00:13:49.166 } 00:13:49.166 }' 00:13:49.166 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:49.166 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:13:49.166 pt2 00:13:49.166 pt3' 00:13:49.166 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:49.166 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:49.166 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:49.425 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:49.425 "name": "pt1", 00:13:49.425 "aliases": [ 00:13:49.425 "f29ec9c5-3b36-51ed-85e5-022779bf2132" 00:13:49.425 ], 00:13:49.425 "product_name": "passthru", 00:13:49.425 "block_size": 512, 00:13:49.425 "num_blocks": 65536, 00:13:49.425 "uuid": "f29ec9c5-3b36-51ed-85e5-022779bf2132", 00:13:49.425 "assigned_rate_limits": { 00:13:49.425 "rw_ios_per_sec": 0, 00:13:49.425 
"rw_mbytes_per_sec": 0, 00:13:49.425 "r_mbytes_per_sec": 0, 00:13:49.425 "w_mbytes_per_sec": 0 00:13:49.425 }, 00:13:49.425 "claimed": true, 00:13:49.425 "claim_type": "exclusive_write", 00:13:49.425 "zoned": false, 00:13:49.425 "supported_io_types": { 00:13:49.425 "read": true, 00:13:49.425 "write": true, 00:13:49.425 "unmap": true, 00:13:49.425 "write_zeroes": true, 00:13:49.425 "flush": true, 00:13:49.425 "reset": true, 00:13:49.425 "compare": false, 00:13:49.425 "compare_and_write": false, 00:13:49.425 "abort": true, 00:13:49.425 "nvme_admin": false, 00:13:49.425 "nvme_io": false 00:13:49.425 }, 00:13:49.425 "memory_domains": [ 00:13:49.425 { 00:13:49.425 "dma_device_id": "system", 00:13:49.425 "dma_device_type": 1 00:13:49.425 }, 00:13:49.426 { 00:13:49.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.426 "dma_device_type": 2 00:13:49.426 } 00:13:49.426 ], 00:13:49.426 "driver_specific": { 00:13:49.426 "passthru": { 00:13:49.426 "name": "pt1", 00:13:49.426 "base_bdev_name": "malloc1" 00:13:49.426 } 00:13:49.426 } 00:13:49.426 }' 00:13:49.426 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:49.685 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:49.685 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:49.685 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:49.685 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:49.685 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.685 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:49.685 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:49.685 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.685 11:50:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:49.944 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:49.944 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:49.944 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:49.944 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:49.944 11:50:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:50.203 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:50.203 "name": "pt2", 00:13:50.203 "aliases": [ 00:13:50.203 "d12d6697-067d-5f8a-8b52-96bd289cdab3" 00:13:50.203 ], 00:13:50.203 "product_name": "passthru", 00:13:50.203 "block_size": 512, 00:13:50.203 "num_blocks": 65536, 00:13:50.203 "uuid": "d12d6697-067d-5f8a-8b52-96bd289cdab3", 00:13:50.203 "assigned_rate_limits": { 00:13:50.203 "rw_ios_per_sec": 0, 00:13:50.203 "rw_mbytes_per_sec": 0, 00:13:50.203 "r_mbytes_per_sec": 0, 00:13:50.203 "w_mbytes_per_sec": 0 00:13:50.203 }, 00:13:50.203 "claimed": true, 00:13:50.203 "claim_type": "exclusive_write", 00:13:50.203 "zoned": false, 00:13:50.203 "supported_io_types": { 00:13:50.203 "read": true, 00:13:50.203 "write": true, 00:13:50.203 "unmap": true, 00:13:50.203 "write_zeroes": true, 00:13:50.203 "flush": true, 00:13:50.203 "reset": true, 00:13:50.203 "compare": false, 00:13:50.203 "compare_and_write": false, 00:13:50.203 "abort": true, 00:13:50.203 "nvme_admin": false, 00:13:50.203 "nvme_io": false 00:13:50.203 }, 00:13:50.203 "memory_domains": [ 00:13:50.203 { 00:13:50.203 "dma_device_id": "system", 00:13:50.203 "dma_device_type": 1 00:13:50.203 }, 00:13:50.203 { 00:13:50.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.203 "dma_device_type": 2 
00:13:50.203 } 00:13:50.203 ], 00:13:50.203 "driver_specific": { 00:13:50.203 "passthru": { 00:13:50.203 "name": "pt2", 00:13:50.203 "base_bdev_name": "malloc2" 00:13:50.203 } 00:13:50.203 } 00:13:50.203 }' 00:13:50.203 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:50.203 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:50.203 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:50.203 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:50.203 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:50.203 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:50.203 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:50.463 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:50.463 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:50.463 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:50.463 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:50.463 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:50.463 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:50.463 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:50.463 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:50.722 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:50.722 "name": "pt3", 00:13:50.722 "aliases": [ 00:13:50.722 "00fb8873-90cd-5131-8a40-fb306397e398" 
00:13:50.722 ], 00:13:50.722 "product_name": "passthru", 00:13:50.722 "block_size": 512, 00:13:50.722 "num_blocks": 65536, 00:13:50.722 "uuid": "00fb8873-90cd-5131-8a40-fb306397e398", 00:13:50.722 "assigned_rate_limits": { 00:13:50.722 "rw_ios_per_sec": 0, 00:13:50.722 "rw_mbytes_per_sec": 0, 00:13:50.722 "r_mbytes_per_sec": 0, 00:13:50.722 "w_mbytes_per_sec": 0 00:13:50.722 }, 00:13:50.722 "claimed": true, 00:13:50.722 "claim_type": "exclusive_write", 00:13:50.722 "zoned": false, 00:13:50.722 "supported_io_types": { 00:13:50.722 "read": true, 00:13:50.722 "write": true, 00:13:50.722 "unmap": true, 00:13:50.722 "write_zeroes": true, 00:13:50.722 "flush": true, 00:13:50.722 "reset": true, 00:13:50.722 "compare": false, 00:13:50.722 "compare_and_write": false, 00:13:50.722 "abort": true, 00:13:50.722 "nvme_admin": false, 00:13:50.722 "nvme_io": false 00:13:50.722 }, 00:13:50.722 "memory_domains": [ 00:13:50.722 { 00:13:50.722 "dma_device_id": "system", 00:13:50.722 "dma_device_type": 1 00:13:50.722 }, 00:13:50.722 { 00:13:50.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.722 "dma_device_type": 2 00:13:50.722 } 00:13:50.722 ], 00:13:50.722 "driver_specific": { 00:13:50.722 "passthru": { 00:13:50.722 "name": "pt3", 00:13:50.722 "base_bdev_name": "malloc3" 00:13:50.722 } 00:13:50.722 } 00:13:50.722 }' 00:13:50.722 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:50.722 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:50.722 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:50.722 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:50.722 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:50.981 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:50.981 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:13:50.981 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:50.981 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:50.981 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:50.981 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:50.981 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:50.981 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:50.981 11:50:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:13:51.241 [2024-05-14 11:50:18.203858] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 241658cd-0150-4b50-8fd9-b260c26dfc67 '!=' 241658cd-0150-4b50-8fd9-b260c26dfc67 ']' 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy concat 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 1697849 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 1697849 ']' 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 1697849 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1697849 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1697849' 00:13:51.241 killing process with pid 1697849 00:13:51.241 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 1697849 00:13:51.241 [2024-05-14 11:50:18.271575] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:51.241 [2024-05-14 11:50:18.271648] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:51.242 [2024-05-14 11:50:18.271708] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:51.242 [2024-05-14 11:50:18.271721] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x206d7b0 name raid_bdev1, state offline 00:13:51.242 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 1697849 00:13:51.242 [2024-05-14 11:50:18.302797] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:51.501 11:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:13:51.501 00:13:51.501 real 0m13.396s 00:13:51.501 user 0m24.115s 00:13:51.501 sys 0m2.403s 00:13:51.501 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:13:51.501 11:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.501 ************************************ 00:13:51.501 END TEST raid_superblock_test 00:13:51.501 ************************************ 00:13:51.501 11:50:18 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:13:51.501 11:50:18 bdev_raid -- bdev/bdev_raid.sh@815 -- # 
run_test raid_state_function_test raid_state_function_test raid1 3 false 00:13:51.501 11:50:18 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:13:51.501 11:50:18 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:13:51.501 11:50:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:51.762 ************************************ 00:13:51.762 START TEST raid_state_function_test 00:13:51.762 ************************************ 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 3 false 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:13:51.762 11:50:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=1699898 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1699898' 00:13:51.762 Process raid pid: 1699898 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 1699898 /var/tmp/spdk-raid.sock 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@827 -- # '[' -z 1699898 ']' 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:51.762 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:13:51.762 11:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.762 [2024-05-14 11:50:18.683184] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:13:51.762 [2024-05-14 11:50:18.683245] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:51.762 [2024-05-14 11:50:18.812062] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:52.022 [2024-05-14 11:50:18.918526] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:52.022 [2024-05-14 11:50:18.982733] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:52.022 [2024-05-14 11:50:18.982762] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:52.591 11:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:13:52.591 11:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:13:52.591 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:52.851 [2024-05-14 11:50:19.821355] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:52.851 [2024-05-14 11:50:19.821397] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:52.851 [2024-05-14 11:50:19.821415] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:52.851 [2024-05-14 11:50:19.821427] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:52.851 [2024-05-14 11:50:19.821436] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:52.851 [2024-05-14 11:50:19.821447] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:52.851 11:50:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.851 11:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.111 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:53.111 "name": "Existed_Raid", 00:13:53.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.111 "strip_size_kb": 0, 00:13:53.111 "state": "configuring", 00:13:53.111 "raid_level": "raid1", 00:13:53.111 "superblock": false, 00:13:53.111 "num_base_bdevs": 3, 00:13:53.111 "num_base_bdevs_discovered": 0, 00:13:53.111 "num_base_bdevs_operational": 3, 00:13:53.111 "base_bdevs_list": [ 00:13:53.111 { 00:13:53.111 "name": "BaseBdev1", 00:13:53.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.111 "is_configured": false, 00:13:53.111 "data_offset": 0, 00:13:53.111 "data_size": 0 00:13:53.111 }, 00:13:53.111 { 00:13:53.111 "name": "BaseBdev2", 00:13:53.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.111 "is_configured": false, 00:13:53.111 "data_offset": 0, 00:13:53.111 "data_size": 0 00:13:53.111 }, 00:13:53.111 { 00:13:53.111 "name": "BaseBdev3", 00:13:53.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.111 "is_configured": false, 00:13:53.111 "data_offset": 0, 00:13:53.111 "data_size": 0 00:13:53.111 } 00:13:53.111 ] 00:13:53.111 }' 00:13:53.111 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:53.111 11:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:53.680 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_delete Existed_Raid 00:13:53.939 [2024-05-14 11:50:20.904178] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:53.939 [2024-05-14 11:50:20.904213] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf4a700 name Existed_Raid, state configuring 00:13:53.939 11:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:54.199 [2024-05-14 11:50:21.148846] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:54.199 [2024-05-14 11:50:21.148883] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:54.199 [2024-05-14 11:50:21.148894] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:54.199 [2024-05-14 11:50:21.148905] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:54.199 [2024-05-14 11:50:21.148914] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:54.199 [2024-05-14 11:50:21.148925] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:54.199 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:54.458 [2024-05-14 11:50:21.407363] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:54.458 BaseBdev1 00:13:54.458 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:13:54.458 11:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:54.458 11:50:21 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:54.458 11:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:54.458 11:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:54.458 11:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:54.458 11:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:54.718 11:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:54.978 [ 00:13:54.978 { 00:13:54.978 "name": "BaseBdev1", 00:13:54.978 "aliases": [ 00:13:54.978 "4072fd4d-f8bb-44c3-890c-8658daa1df69" 00:13:54.978 ], 00:13:54.978 "product_name": "Malloc disk", 00:13:54.978 "block_size": 512, 00:13:54.978 "num_blocks": 65536, 00:13:54.978 "uuid": "4072fd4d-f8bb-44c3-890c-8658daa1df69", 00:13:54.978 "assigned_rate_limits": { 00:13:54.978 "rw_ios_per_sec": 0, 00:13:54.978 "rw_mbytes_per_sec": 0, 00:13:54.978 "r_mbytes_per_sec": 0, 00:13:54.978 "w_mbytes_per_sec": 0 00:13:54.978 }, 00:13:54.978 "claimed": true, 00:13:54.978 "claim_type": "exclusive_write", 00:13:54.978 "zoned": false, 00:13:54.978 "supported_io_types": { 00:13:54.978 "read": true, 00:13:54.978 "write": true, 00:13:54.978 "unmap": true, 00:13:54.978 "write_zeroes": true, 00:13:54.978 "flush": true, 00:13:54.978 "reset": true, 00:13:54.978 "compare": false, 00:13:54.978 "compare_and_write": false, 00:13:54.978 "abort": true, 00:13:54.978 "nvme_admin": false, 00:13:54.978 "nvme_io": false 00:13:54.978 }, 00:13:54.978 "memory_domains": [ 00:13:54.978 { 00:13:54.978 "dma_device_id": "system", 00:13:54.978 "dma_device_type": 1 00:13:54.978 }, 00:13:54.978 { 00:13:54.978 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.978 "dma_device_type": 2 00:13:54.978 } 00:13:54.978 ], 00:13:54.978 "driver_specific": {} 00:13:54.978 } 00:13:54.978 ] 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.978 11:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.237 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:55.237 "name": "Existed_Raid", 00:13:55.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.237 "strip_size_kb": 
0, 00:13:55.237 "state": "configuring", 00:13:55.237 "raid_level": "raid1", 00:13:55.237 "superblock": false, 00:13:55.237 "num_base_bdevs": 3, 00:13:55.237 "num_base_bdevs_discovered": 1, 00:13:55.237 "num_base_bdevs_operational": 3, 00:13:55.237 "base_bdevs_list": [ 00:13:55.237 { 00:13:55.237 "name": "BaseBdev1", 00:13:55.237 "uuid": "4072fd4d-f8bb-44c3-890c-8658daa1df69", 00:13:55.237 "is_configured": true, 00:13:55.237 "data_offset": 0, 00:13:55.237 "data_size": 65536 00:13:55.237 }, 00:13:55.237 { 00:13:55.237 "name": "BaseBdev2", 00:13:55.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.237 "is_configured": false, 00:13:55.237 "data_offset": 0, 00:13:55.237 "data_size": 0 00:13:55.237 }, 00:13:55.237 { 00:13:55.237 "name": "BaseBdev3", 00:13:55.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.237 "is_configured": false, 00:13:55.237 "data_offset": 0, 00:13:55.237 "data_size": 0 00:13:55.237 } 00:13:55.237 ] 00:13:55.237 }' 00:13:55.237 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:55.237 11:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:55.806 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:56.066 [2024-05-14 11:50:22.963492] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:56.066 [2024-05-14 11:50:22.963532] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf49ff0 name Existed_Raid, state configuring 00:13:56.066 11:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:56.066 [2024-05-14 11:50:23.143997] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:13:56.066 [2024-05-14 11:50:23.145476] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:56.066 [2024-05-14 11:50:23.145508] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:56.066 [2024-05-14 11:50:23.145518] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:56.066 [2024-05-14 11:50:23.145530] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:56.356 "name": "Existed_Raid", 00:13:56.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.356 "strip_size_kb": 0, 00:13:56.356 "state": "configuring", 00:13:56.356 "raid_level": "raid1", 00:13:56.356 "superblock": false, 00:13:56.356 "num_base_bdevs": 3, 00:13:56.356 "num_base_bdevs_discovered": 1, 00:13:56.356 "num_base_bdevs_operational": 3, 00:13:56.356 "base_bdevs_list": [ 00:13:56.356 { 00:13:56.356 "name": "BaseBdev1", 00:13:56.356 "uuid": "4072fd4d-f8bb-44c3-890c-8658daa1df69", 00:13:56.356 "is_configured": true, 00:13:56.356 "data_offset": 0, 00:13:56.356 "data_size": 65536 00:13:56.356 }, 00:13:56.356 { 00:13:56.356 "name": "BaseBdev2", 00:13:56.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.356 "is_configured": false, 00:13:56.356 "data_offset": 0, 00:13:56.356 "data_size": 0 00:13:56.356 }, 00:13:56.356 { 00:13:56.356 "name": "BaseBdev3", 00:13:56.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.356 "is_configured": false, 00:13:56.356 "data_offset": 0, 00:13:56.356 "data_size": 0 00:13:56.356 } 00:13:56.356 ] 00:13:56.356 }' 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:56.356 11:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.295 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:57.295 [2024-05-14 11:50:24.174148] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
00:13:57.295 BaseBdev2 00:13:57.295 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:13:57.295 11:50:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:57.295 11:50:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:57.295 11:50:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:57.295 11:50:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:57.295 11:50:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:57.295 11:50:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:57.295 11:50:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:57.554 [ 00:13:57.554 { 00:13:57.554 "name": "BaseBdev2", 00:13:57.554 "aliases": [ 00:13:57.554 "83341d57-3f67-40e4-9099-40580d6042af" 00:13:57.554 ], 00:13:57.554 "product_name": "Malloc disk", 00:13:57.554 "block_size": 512, 00:13:57.554 "num_blocks": 65536, 00:13:57.554 "uuid": "83341d57-3f67-40e4-9099-40580d6042af", 00:13:57.554 "assigned_rate_limits": { 00:13:57.554 "rw_ios_per_sec": 0, 00:13:57.554 "rw_mbytes_per_sec": 0, 00:13:57.554 "r_mbytes_per_sec": 0, 00:13:57.554 "w_mbytes_per_sec": 0 00:13:57.554 }, 00:13:57.554 "claimed": true, 00:13:57.554 "claim_type": "exclusive_write", 00:13:57.554 "zoned": false, 00:13:57.554 "supported_io_types": { 00:13:57.554 "read": true, 00:13:57.554 "write": true, 00:13:57.554 "unmap": true, 00:13:57.554 "write_zeroes": true, 00:13:57.554 "flush": true, 00:13:57.554 "reset": true, 00:13:57.554 "compare": false, 
00:13:57.554 "compare_and_write": false, 00:13:57.554 "abort": true, 00:13:57.554 "nvme_admin": false, 00:13:57.554 "nvme_io": false 00:13:57.554 }, 00:13:57.554 "memory_domains": [ 00:13:57.554 { 00:13:57.554 "dma_device_id": "system", 00:13:57.554 "dma_device_type": 1 00:13:57.554 }, 00:13:57.554 { 00:13:57.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.554 "dma_device_type": 2 00:13:57.554 } 00:13:57.554 ], 00:13:57.555 "driver_specific": {} 00:13:57.555 } 00:13:57.555 ] 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:57.555 11:50:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.555 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.814 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:57.814 "name": "Existed_Raid", 00:13:57.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.814 "strip_size_kb": 0, 00:13:57.814 "state": "configuring", 00:13:57.814 "raid_level": "raid1", 00:13:57.814 "superblock": false, 00:13:57.814 "num_base_bdevs": 3, 00:13:57.814 "num_base_bdevs_discovered": 2, 00:13:57.814 "num_base_bdevs_operational": 3, 00:13:57.814 "base_bdevs_list": [ 00:13:57.814 { 00:13:57.814 "name": "BaseBdev1", 00:13:57.814 "uuid": "4072fd4d-f8bb-44c3-890c-8658daa1df69", 00:13:57.814 "is_configured": true, 00:13:57.814 "data_offset": 0, 00:13:57.814 "data_size": 65536 00:13:57.814 }, 00:13:57.814 { 00:13:57.814 "name": "BaseBdev2", 00:13:57.814 "uuid": "83341d57-3f67-40e4-9099-40580d6042af", 00:13:57.814 "is_configured": true, 00:13:57.814 "data_offset": 0, 00:13:57.814 "data_size": 65536 00:13:57.814 }, 00:13:57.814 { 00:13:57.814 "name": "BaseBdev3", 00:13:57.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.814 "is_configured": false, 00:13:57.814 "data_offset": 0, 00:13:57.814 "data_size": 0 00:13:57.814 } 00:13:57.814 ] 00:13:57.814 }' 00:13:57.814 11:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:57.814 11:50:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.382 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:58.654 [2024-05-14 11:50:25.485332] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:58.654 [2024-05-14 11:50:25.485371] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xf4b080 00:13:58.654 [2024-05-14 11:50:25.485380] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:58.654 [2024-05-14 11:50:25.485583] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf4ad50 00:13:58.654 [2024-05-14 11:50:25.485714] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf4b080 00:13:58.654 [2024-05-14 11:50:25.485725] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf4b080 00:13:58.654 [2024-05-14 11:50:25.485900] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:58.654 BaseBdev3 00:13:58.654 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:13:58.654 11:50:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:58.654 11:50:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:58.654 11:50:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:58.654 11:50:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:58.654 11:50:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:58.654 11:50:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:58.654 11:50:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:58.915 [ 00:13:58.915 { 00:13:58.915 
"name": "BaseBdev3", 00:13:58.915 "aliases": [ 00:13:58.915 "a6534e00-5aaa-431b-8258-f1049bba5344" 00:13:58.915 ], 00:13:58.915 "product_name": "Malloc disk", 00:13:58.915 "block_size": 512, 00:13:58.915 "num_blocks": 65536, 00:13:58.915 "uuid": "a6534e00-5aaa-431b-8258-f1049bba5344", 00:13:58.915 "assigned_rate_limits": { 00:13:58.915 "rw_ios_per_sec": 0, 00:13:58.915 "rw_mbytes_per_sec": 0, 00:13:58.915 "r_mbytes_per_sec": 0, 00:13:58.915 "w_mbytes_per_sec": 0 00:13:58.915 }, 00:13:58.915 "claimed": true, 00:13:58.915 "claim_type": "exclusive_write", 00:13:58.915 "zoned": false, 00:13:58.915 "supported_io_types": { 00:13:58.915 "read": true, 00:13:58.915 "write": true, 00:13:58.915 "unmap": true, 00:13:58.915 "write_zeroes": true, 00:13:58.915 "flush": true, 00:13:58.915 "reset": true, 00:13:58.915 "compare": false, 00:13:58.915 "compare_and_write": false, 00:13:58.915 "abort": true, 00:13:58.915 "nvme_admin": false, 00:13:58.915 "nvme_io": false 00:13:58.915 }, 00:13:58.915 "memory_domains": [ 00:13:58.915 { 00:13:58.915 "dma_device_id": "system", 00:13:58.915 "dma_device_type": 1 00:13:58.915 }, 00:13:58.915 { 00:13:58.915 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.915 "dma_device_type": 2 00:13:58.915 } 00:13:58.915 ], 00:13:58.915 "driver_specific": {} 00:13:58.915 } 00:13:58.915 ] 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
expected_state=online 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.915 11:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.175 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:59.175 "name": "Existed_Raid", 00:13:59.175 "uuid": "5ea9aea4-ffd6-488b-b818-2bc525fe822c", 00:13:59.175 "strip_size_kb": 0, 00:13:59.175 "state": "online", 00:13:59.175 "raid_level": "raid1", 00:13:59.175 "superblock": false, 00:13:59.175 "num_base_bdevs": 3, 00:13:59.175 "num_base_bdevs_discovered": 3, 00:13:59.175 "num_base_bdevs_operational": 3, 00:13:59.175 "base_bdevs_list": [ 00:13:59.175 { 00:13:59.175 "name": "BaseBdev1", 00:13:59.175 "uuid": "4072fd4d-f8bb-44c3-890c-8658daa1df69", 00:13:59.175 "is_configured": true, 00:13:59.175 "data_offset": 0, 00:13:59.175 "data_size": 65536 00:13:59.175 }, 00:13:59.175 { 00:13:59.175 "name": "BaseBdev2", 00:13:59.175 "uuid": "83341d57-3f67-40e4-9099-40580d6042af", 00:13:59.175 "is_configured": true, 
00:13:59.175 "data_offset": 0, 00:13:59.175 "data_size": 65536 00:13:59.175 }, 00:13:59.175 { 00:13:59.175 "name": "BaseBdev3", 00:13:59.175 "uuid": "a6534e00-5aaa-431b-8258-f1049bba5344", 00:13:59.175 "is_configured": true, 00:13:59.175 "data_offset": 0, 00:13:59.175 "data_size": 65536 00:13:59.175 } 00:13:59.175 ] 00:13:59.175 }' 00:13:59.175 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:59.175 11:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.743 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:13:59.743 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:13:59.743 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:59.743 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:59.743 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:59.743 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:59.743 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:59.743 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:00.002 [2024-05-14 11:50:26.977570] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:00.002 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:00.002 "name": "Existed_Raid", 00:14:00.002 "aliases": [ 00:14:00.002 "5ea9aea4-ffd6-488b-b818-2bc525fe822c" 00:14:00.002 ], 00:14:00.002 "product_name": "Raid Volume", 00:14:00.002 "block_size": 512, 00:14:00.002 "num_blocks": 65536, 00:14:00.002 
"uuid": "5ea9aea4-ffd6-488b-b818-2bc525fe822c", 00:14:00.002 "assigned_rate_limits": { 00:14:00.002 "rw_ios_per_sec": 0, 00:14:00.002 "rw_mbytes_per_sec": 0, 00:14:00.002 "r_mbytes_per_sec": 0, 00:14:00.002 "w_mbytes_per_sec": 0 00:14:00.002 }, 00:14:00.002 "claimed": false, 00:14:00.002 "zoned": false, 00:14:00.002 "supported_io_types": { 00:14:00.003 "read": true, 00:14:00.003 "write": true, 00:14:00.003 "unmap": false, 00:14:00.003 "write_zeroes": true, 00:14:00.003 "flush": false, 00:14:00.003 "reset": true, 00:14:00.003 "compare": false, 00:14:00.003 "compare_and_write": false, 00:14:00.003 "abort": false, 00:14:00.003 "nvme_admin": false, 00:14:00.003 "nvme_io": false 00:14:00.003 }, 00:14:00.003 "memory_domains": [ 00:14:00.003 { 00:14:00.003 "dma_device_id": "system", 00:14:00.003 "dma_device_type": 1 00:14:00.003 }, 00:14:00.003 { 00:14:00.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.003 "dma_device_type": 2 00:14:00.003 }, 00:14:00.003 { 00:14:00.003 "dma_device_id": "system", 00:14:00.003 "dma_device_type": 1 00:14:00.003 }, 00:14:00.003 { 00:14:00.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.003 "dma_device_type": 2 00:14:00.003 }, 00:14:00.003 { 00:14:00.003 "dma_device_id": "system", 00:14:00.003 "dma_device_type": 1 00:14:00.003 }, 00:14:00.003 { 00:14:00.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.003 "dma_device_type": 2 00:14:00.003 } 00:14:00.003 ], 00:14:00.003 "driver_specific": { 00:14:00.003 "raid": { 00:14:00.003 "uuid": "5ea9aea4-ffd6-488b-b818-2bc525fe822c", 00:14:00.003 "strip_size_kb": 0, 00:14:00.003 "state": "online", 00:14:00.003 "raid_level": "raid1", 00:14:00.003 "superblock": false, 00:14:00.003 "num_base_bdevs": 3, 00:14:00.003 "num_base_bdevs_discovered": 3, 00:14:00.003 "num_base_bdevs_operational": 3, 00:14:00.003 "base_bdevs_list": [ 00:14:00.003 { 00:14:00.003 "name": "BaseBdev1", 00:14:00.003 "uuid": "4072fd4d-f8bb-44c3-890c-8658daa1df69", 00:14:00.003 "is_configured": true, 00:14:00.003 
"data_offset": 0, 00:14:00.003 "data_size": 65536 00:14:00.003 }, 00:14:00.003 { 00:14:00.003 "name": "BaseBdev2", 00:14:00.003 "uuid": "83341d57-3f67-40e4-9099-40580d6042af", 00:14:00.003 "is_configured": true, 00:14:00.003 "data_offset": 0, 00:14:00.003 "data_size": 65536 00:14:00.003 }, 00:14:00.003 { 00:14:00.003 "name": "BaseBdev3", 00:14:00.003 "uuid": "a6534e00-5aaa-431b-8258-f1049bba5344", 00:14:00.003 "is_configured": true, 00:14:00.003 "data_offset": 0, 00:14:00.003 "data_size": 65536 00:14:00.003 } 00:14:00.003 ] 00:14:00.003 } 00:14:00.003 } 00:14:00.003 }' 00:14:00.003 11:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:00.003 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:14:00.003 BaseBdev2 00:14:00.003 BaseBdev3' 00:14:00.003 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:00.003 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:00.003 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:00.262 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:00.262 "name": "BaseBdev1", 00:14:00.262 "aliases": [ 00:14:00.262 "4072fd4d-f8bb-44c3-890c-8658daa1df69" 00:14:00.262 ], 00:14:00.262 "product_name": "Malloc disk", 00:14:00.262 "block_size": 512, 00:14:00.262 "num_blocks": 65536, 00:14:00.262 "uuid": "4072fd4d-f8bb-44c3-890c-8658daa1df69", 00:14:00.262 "assigned_rate_limits": { 00:14:00.262 "rw_ios_per_sec": 0, 00:14:00.262 "rw_mbytes_per_sec": 0, 00:14:00.263 "r_mbytes_per_sec": 0, 00:14:00.263 "w_mbytes_per_sec": 0 00:14:00.263 }, 00:14:00.263 "claimed": true, 00:14:00.263 "claim_type": 
"exclusive_write", 00:14:00.263 "zoned": false, 00:14:00.263 "supported_io_types": { 00:14:00.263 "read": true, 00:14:00.263 "write": true, 00:14:00.263 "unmap": true, 00:14:00.263 "write_zeroes": true, 00:14:00.263 "flush": true, 00:14:00.263 "reset": true, 00:14:00.263 "compare": false, 00:14:00.263 "compare_and_write": false, 00:14:00.263 "abort": true, 00:14:00.263 "nvme_admin": false, 00:14:00.263 "nvme_io": false 00:14:00.263 }, 00:14:00.263 "memory_domains": [ 00:14:00.263 { 00:14:00.263 "dma_device_id": "system", 00:14:00.263 "dma_device_type": 1 00:14:00.263 }, 00:14:00.263 { 00:14:00.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.263 "dma_device_type": 2 00:14:00.263 } 00:14:00.263 ], 00:14:00.263 "driver_specific": {} 00:14:00.263 }' 00:14:00.263 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:00.263 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:00.522 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:00.523 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:00.523 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:00.523 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:00.523 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:00.523 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:00.523 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:00.523 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:00.523 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:00.782 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 
00:14:00.782 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:00.782 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:00.782 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:01.041 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:01.041 "name": "BaseBdev2", 00:14:01.041 "aliases": [ 00:14:01.041 "83341d57-3f67-40e4-9099-40580d6042af" 00:14:01.041 ], 00:14:01.041 "product_name": "Malloc disk", 00:14:01.041 "block_size": 512, 00:14:01.041 "num_blocks": 65536, 00:14:01.041 "uuid": "83341d57-3f67-40e4-9099-40580d6042af", 00:14:01.041 "assigned_rate_limits": { 00:14:01.041 "rw_ios_per_sec": 0, 00:14:01.041 "rw_mbytes_per_sec": 0, 00:14:01.041 "r_mbytes_per_sec": 0, 00:14:01.041 "w_mbytes_per_sec": 0 00:14:01.041 }, 00:14:01.041 "claimed": true, 00:14:01.041 "claim_type": "exclusive_write", 00:14:01.041 "zoned": false, 00:14:01.041 "supported_io_types": { 00:14:01.041 "read": true, 00:14:01.041 "write": true, 00:14:01.041 "unmap": true, 00:14:01.041 "write_zeroes": true, 00:14:01.041 "flush": true, 00:14:01.041 "reset": true, 00:14:01.041 "compare": false, 00:14:01.041 "compare_and_write": false, 00:14:01.041 "abort": true, 00:14:01.041 "nvme_admin": false, 00:14:01.041 "nvme_io": false 00:14:01.041 }, 00:14:01.041 "memory_domains": [ 00:14:01.041 { 00:14:01.041 "dma_device_id": "system", 00:14:01.041 "dma_device_type": 1 00:14:01.041 }, 00:14:01.041 { 00:14:01.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.041 "dma_device_type": 2 00:14:01.041 } 00:14:01.041 ], 00:14:01.041 "driver_specific": {} 00:14:01.041 }' 00:14:01.041 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:01.041 11:50:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:01.041 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:01.041 11:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:01.041 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:01.041 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:01.041 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:01.041 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:01.300 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:01.300 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:01.300 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:01.300 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:01.300 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:01.300 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:01.300 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:01.560 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:01.561 "name": "BaseBdev3", 00:14:01.561 "aliases": [ 00:14:01.561 "a6534e00-5aaa-431b-8258-f1049bba5344" 00:14:01.561 ], 00:14:01.561 "product_name": "Malloc disk", 00:14:01.561 "block_size": 512, 00:14:01.561 "num_blocks": 65536, 00:14:01.561 "uuid": "a6534e00-5aaa-431b-8258-f1049bba5344", 00:14:01.561 "assigned_rate_limits": { 00:14:01.561 "rw_ios_per_sec": 0, 00:14:01.561 "rw_mbytes_per_sec": 
0, 00:14:01.561 "r_mbytes_per_sec": 0, 00:14:01.561 "w_mbytes_per_sec": 0 00:14:01.561 }, 00:14:01.561 "claimed": true, 00:14:01.561 "claim_type": "exclusive_write", 00:14:01.561 "zoned": false, 00:14:01.561 "supported_io_types": { 00:14:01.561 "read": true, 00:14:01.561 "write": true, 00:14:01.561 "unmap": true, 00:14:01.561 "write_zeroes": true, 00:14:01.561 "flush": true, 00:14:01.561 "reset": true, 00:14:01.561 "compare": false, 00:14:01.561 "compare_and_write": false, 00:14:01.561 "abort": true, 00:14:01.561 "nvme_admin": false, 00:14:01.561 "nvme_io": false 00:14:01.561 }, 00:14:01.561 "memory_domains": [ 00:14:01.561 { 00:14:01.561 "dma_device_id": "system", 00:14:01.561 "dma_device_type": 1 00:14:01.561 }, 00:14:01.561 { 00:14:01.561 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.561 "dma_device_type": 2 00:14:01.561 } 00:14:01.561 ], 00:14:01.561 "driver_specific": {} 00:14:01.561 }' 00:14:01.561 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:01.561 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:01.561 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:01.561 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:01.561 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:01.561 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:01.561 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:01.821 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:01.821 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:01.821 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:01.821 11:50:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:01.821 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:01.821 11:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:02.081 [2024-05-14 11:50:29.010746] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 0 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # 
local num_base_bdevs_discovered 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.081 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.341 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:02.341 "name": "Existed_Raid", 00:14:02.341 "uuid": "5ea9aea4-ffd6-488b-b818-2bc525fe822c", 00:14:02.341 "strip_size_kb": 0, 00:14:02.341 "state": "online", 00:14:02.341 "raid_level": "raid1", 00:14:02.341 "superblock": false, 00:14:02.341 "num_base_bdevs": 3, 00:14:02.341 "num_base_bdevs_discovered": 2, 00:14:02.341 "num_base_bdevs_operational": 2, 00:14:02.341 "base_bdevs_list": [ 00:14:02.341 { 00:14:02.341 "name": null, 00:14:02.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:02.341 "is_configured": false, 00:14:02.341 "data_offset": 0, 00:14:02.341 "data_size": 65536 00:14:02.341 }, 00:14:02.341 { 00:14:02.341 "name": "BaseBdev2", 00:14:02.341 "uuid": "83341d57-3f67-40e4-9099-40580d6042af", 00:14:02.341 "is_configured": true, 00:14:02.341 "data_offset": 0, 00:14:02.341 "data_size": 65536 00:14:02.341 }, 00:14:02.341 { 00:14:02.341 "name": "BaseBdev3", 00:14:02.341 "uuid": "a6534e00-5aaa-431b-8258-f1049bba5344", 00:14:02.341 "is_configured": true, 00:14:02.341 "data_offset": 0, 00:14:02.341 "data_size": 65536 00:14:02.341 } 00:14:02.341 ] 00:14:02.341 }' 00:14:02.341 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:02.341 11:50:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.908 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:14:02.908 11:50:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:02.908 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.908 11:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:03.168 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:03.168 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:03.168 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:03.428 [2024-05-14 11:50:30.344203] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:03.428 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:03.428 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:03.428 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.428 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:03.688 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:03.688 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:03.688 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:03.947 [2024-05-14 11:50:30.838145] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev3 00:14:03.947 [2024-05-14 11:50:30.838224] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:03.947 [2024-05-14 11:50:30.850841] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:03.947 [2024-05-14 11:50:30.850910] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:03.947 [2024-05-14 11:50:30.850924] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf4b080 name Existed_Raid, state offline 00:14:03.947 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:03.947 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:03.947 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.947 11:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:14:04.207 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:14:04.207 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:14:04.207 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:14:04.207 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:14:04.207 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:04.207 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:04.466 BaseBdev2 00:14:04.466 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:14:04.466 11:50:31 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:14:04.466 11:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:04.466 11:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:04.466 11:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:04.466 11:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:04.466 11:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.726 11:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:04.985 [ 00:14:04.985 { 00:14:04.985 "name": "BaseBdev2", 00:14:04.985 "aliases": [ 00:14:04.985 "1e920988-45fb-4751-9ded-9fb3261ba3d7" 00:14:04.985 ], 00:14:04.985 "product_name": "Malloc disk", 00:14:04.985 "block_size": 512, 00:14:04.985 "num_blocks": 65536, 00:14:04.985 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:04.985 "assigned_rate_limits": { 00:14:04.985 "rw_ios_per_sec": 0, 00:14:04.985 "rw_mbytes_per_sec": 0, 00:14:04.985 "r_mbytes_per_sec": 0, 00:14:04.985 "w_mbytes_per_sec": 0 00:14:04.985 }, 00:14:04.985 "claimed": false, 00:14:04.985 "zoned": false, 00:14:04.985 "supported_io_types": { 00:14:04.985 "read": true, 00:14:04.985 "write": true, 00:14:04.985 "unmap": true, 00:14:04.985 "write_zeroes": true, 00:14:04.985 "flush": true, 00:14:04.985 "reset": true, 00:14:04.985 "compare": false, 00:14:04.985 "compare_and_write": false, 00:14:04.985 "abort": true, 00:14:04.985 "nvme_admin": false, 00:14:04.985 "nvme_io": false 00:14:04.985 }, 00:14:04.985 "memory_domains": [ 00:14:04.985 { 
00:14:04.985 "dma_device_id": "system", 00:14:04.985 "dma_device_type": 1 00:14:04.985 }, 00:14:04.985 { 00:14:04.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.985 "dma_device_type": 2 00:14:04.985 } 00:14:04.985 ], 00:14:04.985 "driver_specific": {} 00:14:04.985 } 00:14:04.985 ] 00:14:04.985 11:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:04.985 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:04.985 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:04.985 11:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:04.985 BaseBdev3 00:14:04.985 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:14:04.985 11:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:14:04.985 11:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:04.985 11:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:04.985 11:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:04.985 11:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:04.985 11:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:05.245 11:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:05.503 [ 00:14:05.503 { 00:14:05.503 "name": 
"BaseBdev3", 00:14:05.503 "aliases": [ 00:14:05.503 "c79a3164-5dc3-4ab9-8b24-1b0912986327" 00:14:05.503 ], 00:14:05.503 "product_name": "Malloc disk", 00:14:05.503 "block_size": 512, 00:14:05.503 "num_blocks": 65536, 00:14:05.503 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:05.503 "assigned_rate_limits": { 00:14:05.503 "rw_ios_per_sec": 0, 00:14:05.503 "rw_mbytes_per_sec": 0, 00:14:05.503 "r_mbytes_per_sec": 0, 00:14:05.503 "w_mbytes_per_sec": 0 00:14:05.503 }, 00:14:05.503 "claimed": false, 00:14:05.503 "zoned": false, 00:14:05.503 "supported_io_types": { 00:14:05.503 "read": true, 00:14:05.503 "write": true, 00:14:05.503 "unmap": true, 00:14:05.503 "write_zeroes": true, 00:14:05.503 "flush": true, 00:14:05.503 "reset": true, 00:14:05.503 "compare": false, 00:14:05.503 "compare_and_write": false, 00:14:05.503 "abort": true, 00:14:05.503 "nvme_admin": false, 00:14:05.503 "nvme_io": false 00:14:05.503 }, 00:14:05.503 "memory_domains": [ 00:14:05.503 { 00:14:05.503 "dma_device_id": "system", 00:14:05.503 "dma_device_type": 1 00:14:05.503 }, 00:14:05.503 { 00:14:05.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.503 "dma_device_type": 2 00:14:05.503 } 00:14:05.503 ], 00:14:05.503 "driver_specific": {} 00:14:05.503 } 00:14:05.503 ] 00:14:05.503 11:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:05.503 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:05.503 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:05.503 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:05.762 [2024-05-14 11:50:32.773811] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:05.762 [2024-05-14 
11:50:32.773852] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:05.762 [2024-05-14 11:50:32.773873] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:05.762 [2024-05-14 11:50:32.775213] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.762 11:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:06.021 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:06.021 "name": "Existed_Raid", 
00:14:06.021 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.021 "strip_size_kb": 0, 00:14:06.021 "state": "configuring", 00:14:06.021 "raid_level": "raid1", 00:14:06.021 "superblock": false, 00:14:06.021 "num_base_bdevs": 3, 00:14:06.021 "num_base_bdevs_discovered": 2, 00:14:06.021 "num_base_bdevs_operational": 3, 00:14:06.021 "base_bdevs_list": [ 00:14:06.022 { 00:14:06.022 "name": "BaseBdev1", 00:14:06.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.022 "is_configured": false, 00:14:06.022 "data_offset": 0, 00:14:06.022 "data_size": 0 00:14:06.022 }, 00:14:06.022 { 00:14:06.022 "name": "BaseBdev2", 00:14:06.022 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:06.022 "is_configured": true, 00:14:06.022 "data_offset": 0, 00:14:06.022 "data_size": 65536 00:14:06.022 }, 00:14:06.022 { 00:14:06.022 "name": "BaseBdev3", 00:14:06.022 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:06.022 "is_configured": true, 00:14:06.022 "data_offset": 0, 00:14:06.022 "data_size": 65536 00:14:06.022 } 00:14:06.022 ] 00:14:06.022 }' 00:14:06.022 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:06.022 11:50:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.590 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:06.849 [2024-05-14 11:50:33.852641] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:06.849 11:50:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.849 11:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.109 11:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:07.109 "name": "Existed_Raid", 00:14:07.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.109 "strip_size_kb": 0, 00:14:07.109 "state": "configuring", 00:14:07.109 "raid_level": "raid1", 00:14:07.109 "superblock": false, 00:14:07.109 "num_base_bdevs": 3, 00:14:07.109 "num_base_bdevs_discovered": 1, 00:14:07.109 "num_base_bdevs_operational": 3, 00:14:07.109 "base_bdevs_list": [ 00:14:07.109 { 00:14:07.109 "name": "BaseBdev1", 00:14:07.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.109 "is_configured": false, 00:14:07.109 "data_offset": 0, 00:14:07.109 "data_size": 0 00:14:07.109 }, 00:14:07.109 { 00:14:07.109 "name": null, 00:14:07.109 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:07.109 "is_configured": false, 00:14:07.109 "data_offset": 0, 00:14:07.109 
"data_size": 65536 00:14:07.109 }, 00:14:07.109 { 00:14:07.109 "name": "BaseBdev3", 00:14:07.109 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:07.109 "is_configured": true, 00:14:07.109 "data_offset": 0, 00:14:07.109 "data_size": 65536 00:14:07.109 } 00:14:07.109 ] 00:14:07.109 }' 00:14:07.109 11:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:07.109 11:50:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:07.678 11:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.678 11:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:07.937 11:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:14:07.937 11:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:08.196 [2024-05-14 11:50:35.123349] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:08.196 BaseBdev1 00:14:08.196 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:14:08.196 11:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:14:08.196 11:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:08.196 11:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:08.196 11:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:08.196 11:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:08.196 11:50:35 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:08.456 [ 00:14:08.456 { 00:14:08.456 "name": "BaseBdev1", 00:14:08.456 "aliases": [ 00:14:08.456 "f7bddf0e-3681-42c8-800b-2e6db784ddd1" 00:14:08.456 ], 00:14:08.456 "product_name": "Malloc disk", 00:14:08.456 "block_size": 512, 00:14:08.456 "num_blocks": 65536, 00:14:08.456 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:08.456 "assigned_rate_limits": { 00:14:08.456 "rw_ios_per_sec": 0, 00:14:08.456 "rw_mbytes_per_sec": 0, 00:14:08.456 "r_mbytes_per_sec": 0, 00:14:08.456 "w_mbytes_per_sec": 0 00:14:08.456 }, 00:14:08.456 "claimed": true, 00:14:08.456 "claim_type": "exclusive_write", 00:14:08.456 "zoned": false, 00:14:08.456 "supported_io_types": { 00:14:08.456 "read": true, 00:14:08.456 "write": true, 00:14:08.456 "unmap": true, 00:14:08.456 "write_zeroes": true, 00:14:08.456 "flush": true, 00:14:08.456 "reset": true, 00:14:08.456 "compare": false, 00:14:08.456 "compare_and_write": false, 00:14:08.456 "abort": true, 00:14:08.456 "nvme_admin": false, 00:14:08.456 "nvme_io": false 00:14:08.456 }, 00:14:08.456 "memory_domains": [ 00:14:08.456 { 00:14:08.456 "dma_device_id": "system", 00:14:08.456 "dma_device_type": 1 00:14:08.456 }, 00:14:08.456 { 00:14:08.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.456 "dma_device_type": 2 00:14:08.456 } 00:14:08.456 ], 00:14:08.456 "driver_specific": {} 00:14:08.456 } 00:14:08.456 ] 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid 
configuring raid1 0 3 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:08.456 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.715 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:08.715 "name": "Existed_Raid", 00:14:08.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.715 "strip_size_kb": 0, 00:14:08.715 "state": "configuring", 00:14:08.715 "raid_level": "raid1", 00:14:08.715 "superblock": false, 00:14:08.715 "num_base_bdevs": 3, 00:14:08.715 "num_base_bdevs_discovered": 2, 00:14:08.715 "num_base_bdevs_operational": 3, 00:14:08.715 "base_bdevs_list": [ 00:14:08.715 { 00:14:08.715 "name": "BaseBdev1", 00:14:08.715 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:08.715 
"is_configured": true, 00:14:08.715 "data_offset": 0, 00:14:08.715 "data_size": 65536 00:14:08.715 }, 00:14:08.715 { 00:14:08.715 "name": null, 00:14:08.716 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:08.716 "is_configured": false, 00:14:08.716 "data_offset": 0, 00:14:08.716 "data_size": 65536 00:14:08.716 }, 00:14:08.716 { 00:14:08.716 "name": "BaseBdev3", 00:14:08.716 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:08.716 "is_configured": true, 00:14:08.716 "data_offset": 0, 00:14:08.716 "data_size": 65536 00:14:08.716 } 00:14:08.716 ] 00:14:08.716 }' 00:14:08.716 11:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:08.716 11:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.284 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.284 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:09.543 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:14:09.543 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:09.803 [2024-05-14 11:50:36.815883] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.803 11:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.062 11:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:10.062 "name": "Existed_Raid", 00:14:10.062 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.062 "strip_size_kb": 0, 00:14:10.062 "state": "configuring", 00:14:10.062 "raid_level": "raid1", 00:14:10.062 "superblock": false, 00:14:10.062 "num_base_bdevs": 3, 00:14:10.062 "num_base_bdevs_discovered": 1, 00:14:10.062 "num_base_bdevs_operational": 3, 00:14:10.062 "base_bdevs_list": [ 00:14:10.062 { 00:14:10.062 "name": "BaseBdev1", 00:14:10.062 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:10.062 "is_configured": true, 00:14:10.062 "data_offset": 0, 00:14:10.062 "data_size": 65536 00:14:10.062 }, 00:14:10.062 { 00:14:10.062 "name": null, 00:14:10.062 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:10.062 "is_configured": false, 00:14:10.062 "data_offset": 0, 00:14:10.062 "data_size": 65536 00:14:10.062 }, 
00:14:10.062 { 00:14:10.062 "name": null, 00:14:10.062 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:10.062 "is_configured": false, 00:14:10.062 "data_offset": 0, 00:14:10.062 "data_size": 65536 00:14:10.062 } 00:14:10.062 ] 00:14:10.062 }' 00:14:10.062 11:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:10.062 11:50:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.677 11:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.677 11:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:10.955 11:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:14:10.956 11:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:11.216 [2024-05-14 11:50:38.107327] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:11.216 11:50:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:11.216 "name": "Existed_Raid", 00:14:11.216 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.216 "strip_size_kb": 0, 00:14:11.216 "state": "configuring", 00:14:11.216 "raid_level": "raid1", 00:14:11.216 "superblock": false, 00:14:11.216 "num_base_bdevs": 3, 00:14:11.216 "num_base_bdevs_discovered": 2, 00:14:11.216 "num_base_bdevs_operational": 3, 00:14:11.216 "base_bdevs_list": [ 00:14:11.216 { 00:14:11.216 "name": "BaseBdev1", 00:14:11.216 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:11.216 "is_configured": true, 00:14:11.216 "data_offset": 0, 00:14:11.216 "data_size": 65536 00:14:11.216 }, 00:14:11.216 { 00:14:11.216 "name": null, 00:14:11.216 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:11.216 "is_configured": false, 00:14:11.216 "data_offset": 0, 00:14:11.216 "data_size": 65536 00:14:11.216 }, 00:14:11.216 { 00:14:11.216 "name": "BaseBdev3", 00:14:11.216 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:11.216 "is_configured": true, 00:14:11.216 "data_offset": 0, 00:14:11.216 "data_size": 65536 00:14:11.216 } 00:14:11.216 ] 00:14:11.216 }' 00:14:11.216 11:50:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:11.216 11:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:11.784 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.784 11:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:12.043 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:14:12.043 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:12.302 [2024-05-14 11:50:39.330580] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 
00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.302 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:12.562 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:12.562 "name": "Existed_Raid", 00:14:12.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:12.562 "strip_size_kb": 0, 00:14:12.562 "state": "configuring", 00:14:12.562 "raid_level": "raid1", 00:14:12.562 "superblock": false, 00:14:12.562 "num_base_bdevs": 3, 00:14:12.562 "num_base_bdevs_discovered": 1, 00:14:12.562 "num_base_bdevs_operational": 3, 00:14:12.562 "base_bdevs_list": [ 00:14:12.562 { 00:14:12.562 "name": null, 00:14:12.562 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:12.562 "is_configured": false, 00:14:12.562 "data_offset": 0, 00:14:12.562 "data_size": 65536 00:14:12.562 }, 00:14:12.562 { 00:14:12.562 "name": null, 00:14:12.562 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:12.562 "is_configured": false, 00:14:12.562 "data_offset": 0, 00:14:12.562 "data_size": 65536 00:14:12.562 }, 00:14:12.562 { 00:14:12.562 "name": "BaseBdev3", 00:14:12.562 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:12.562 "is_configured": true, 00:14:12.562 "data_offset": 0, 00:14:12.562 "data_size": 65536 00:14:12.562 } 00:14:12.562 ] 00:14:12.562 }' 00:14:12.562 11:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:12.562 11:50:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.130 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.130 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:13.388 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:14:13.389 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:13.648 [2024-05-14 11:50:40.578272] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:13.648 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.906 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:13.906 "name": "Existed_Raid", 00:14:13.906 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.906 "strip_size_kb": 0, 00:14:13.906 "state": "configuring", 00:14:13.907 "raid_level": "raid1", 00:14:13.907 "superblock": false, 00:14:13.907 "num_base_bdevs": 3, 00:14:13.907 "num_base_bdevs_discovered": 2, 00:14:13.907 "num_base_bdevs_operational": 3, 00:14:13.907 "base_bdevs_list": [ 00:14:13.907 { 00:14:13.907 "name": null, 00:14:13.907 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:13.907 "is_configured": false, 00:14:13.907 "data_offset": 0, 00:14:13.907 "data_size": 65536 00:14:13.907 }, 00:14:13.907 { 00:14:13.907 "name": "BaseBdev2", 00:14:13.907 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:13.907 "is_configured": true, 00:14:13.907 "data_offset": 0, 00:14:13.907 "data_size": 65536 00:14:13.907 }, 00:14:13.907 { 00:14:13.907 "name": "BaseBdev3", 00:14:13.907 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:13.907 "is_configured": true, 00:14:13.907 "data_offset": 0, 00:14:13.907 "data_size": 65536 00:14:13.907 } 00:14:13.907 ] 00:14:13.907 }' 00:14:13.907 11:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:13.907 11:50:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.473 11:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.473 11:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:14.732 11:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:14:14.732 11:50:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.732 11:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:14.992 11:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f7bddf0e-3681-42c8-800b-2e6db784ddd1 00:14:15.251 [2024-05-14 11:50:42.163006] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:15.251 [2024-05-14 11:50:42.163048] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x10ef790 00:14:15.251 [2024-05-14 11:50:42.163056] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:14:15.251 [2024-05-14 11:50:42.163258] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe64df0 00:14:15.251 [2024-05-14 11:50:42.163382] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10ef790 00:14:15.251 [2024-05-14 11:50:42.163393] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10ef790 00:14:15.251 [2024-05-14 11:50:42.163567] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:15.251 NewBaseBdev 00:14:15.251 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:14:15.251 11:50:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:14:15.251 11:50:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:15.251 11:50:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:15.251 11:50:42 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:15.251 11:50:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:15.251 11:50:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:15.510 11:50:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:15.769 [ 00:14:15.769 { 00:14:15.769 "name": "NewBaseBdev", 00:14:15.769 "aliases": [ 00:14:15.769 "f7bddf0e-3681-42c8-800b-2e6db784ddd1" 00:14:15.769 ], 00:14:15.769 "product_name": "Malloc disk", 00:14:15.769 "block_size": 512, 00:14:15.769 "num_blocks": 65536, 00:14:15.769 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:15.769 "assigned_rate_limits": { 00:14:15.769 "rw_ios_per_sec": 0, 00:14:15.769 "rw_mbytes_per_sec": 0, 00:14:15.769 "r_mbytes_per_sec": 0, 00:14:15.769 "w_mbytes_per_sec": 0 00:14:15.769 }, 00:14:15.769 "claimed": true, 00:14:15.769 "claim_type": "exclusive_write", 00:14:15.769 "zoned": false, 00:14:15.769 "supported_io_types": { 00:14:15.769 "read": true, 00:14:15.769 "write": true, 00:14:15.769 "unmap": true, 00:14:15.769 "write_zeroes": true, 00:14:15.769 "flush": true, 00:14:15.769 "reset": true, 00:14:15.769 "compare": false, 00:14:15.769 "compare_and_write": false, 00:14:15.769 "abort": true, 00:14:15.769 "nvme_admin": false, 00:14:15.769 "nvme_io": false 00:14:15.769 }, 00:14:15.769 "memory_domains": [ 00:14:15.769 { 00:14:15.769 "dma_device_id": "system", 00:14:15.769 "dma_device_type": 1 00:14:15.769 }, 00:14:15.769 { 00:14:15.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.769 "dma_device_type": 2 00:14:15.769 } 00:14:15.769 ], 00:14:15.769 "driver_specific": {} 00:14:15.769 } 00:14:15.769 ] 00:14:15.769 11:50:42 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.769 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:16.027 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:16.027 "name": "Existed_Raid", 00:14:16.027 "uuid": "0e824b70-f3d4-4018-a5c9-702a020e4cfd", 00:14:16.027 "strip_size_kb": 0, 00:14:16.027 "state": "online", 00:14:16.027 "raid_level": "raid1", 00:14:16.027 "superblock": false, 00:14:16.027 "num_base_bdevs": 3, 00:14:16.027 "num_base_bdevs_discovered": 3, 00:14:16.027 
"num_base_bdevs_operational": 3, 00:14:16.027 "base_bdevs_list": [ 00:14:16.027 { 00:14:16.027 "name": "NewBaseBdev", 00:14:16.027 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:16.027 "is_configured": true, 00:14:16.027 "data_offset": 0, 00:14:16.027 "data_size": 65536 00:14:16.027 }, 00:14:16.027 { 00:14:16.027 "name": "BaseBdev2", 00:14:16.027 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:16.027 "is_configured": true, 00:14:16.027 "data_offset": 0, 00:14:16.027 "data_size": 65536 00:14:16.027 }, 00:14:16.027 { 00:14:16.027 "name": "BaseBdev3", 00:14:16.027 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:16.027 "is_configured": true, 00:14:16.027 "data_offset": 0, 00:14:16.027 "data_size": 65536 00:14:16.027 } 00:14:16.027 ] 00:14:16.027 }' 00:14:16.027 11:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:16.027 11:50:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:16.594 [2024-05-14 11:50:43.623139] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:16.594 "name": "Existed_Raid", 00:14:16.594 "aliases": [ 00:14:16.594 "0e824b70-f3d4-4018-a5c9-702a020e4cfd" 00:14:16.594 ], 00:14:16.594 "product_name": "Raid Volume", 00:14:16.594 "block_size": 512, 00:14:16.594 "num_blocks": 65536, 00:14:16.594 "uuid": "0e824b70-f3d4-4018-a5c9-702a020e4cfd", 00:14:16.594 "assigned_rate_limits": { 00:14:16.594 "rw_ios_per_sec": 0, 00:14:16.594 "rw_mbytes_per_sec": 0, 00:14:16.594 "r_mbytes_per_sec": 0, 00:14:16.594 "w_mbytes_per_sec": 0 00:14:16.594 }, 00:14:16.594 "claimed": false, 00:14:16.594 "zoned": false, 00:14:16.594 "supported_io_types": { 00:14:16.594 "read": true, 00:14:16.594 "write": true, 00:14:16.594 "unmap": false, 00:14:16.594 "write_zeroes": true, 00:14:16.594 "flush": false, 00:14:16.594 "reset": true, 00:14:16.594 "compare": false, 00:14:16.594 "compare_and_write": false, 00:14:16.594 "abort": false, 00:14:16.594 "nvme_admin": false, 00:14:16.594 "nvme_io": false 00:14:16.594 }, 00:14:16.594 "memory_domains": [ 00:14:16.594 { 00:14:16.594 "dma_device_id": "system", 00:14:16.594 "dma_device_type": 1 00:14:16.594 }, 00:14:16.594 { 00:14:16.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.594 "dma_device_type": 2 00:14:16.594 }, 00:14:16.594 { 00:14:16.594 "dma_device_id": "system", 00:14:16.594 "dma_device_type": 1 00:14:16.594 }, 00:14:16.594 { 00:14:16.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.594 "dma_device_type": 2 00:14:16.594 }, 00:14:16.594 { 00:14:16.594 "dma_device_id": "system", 00:14:16.594 "dma_device_type": 1 00:14:16.594 }, 00:14:16.594 { 00:14:16.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.594 "dma_device_type": 2 00:14:16.594 } 00:14:16.594 ], 00:14:16.594 "driver_specific": { 00:14:16.594 "raid": { 00:14:16.594 "uuid": "0e824b70-f3d4-4018-a5c9-702a020e4cfd", 00:14:16.594 
"strip_size_kb": 0, 00:14:16.594 "state": "online", 00:14:16.594 "raid_level": "raid1", 00:14:16.594 "superblock": false, 00:14:16.594 "num_base_bdevs": 3, 00:14:16.594 "num_base_bdevs_discovered": 3, 00:14:16.594 "num_base_bdevs_operational": 3, 00:14:16.594 "base_bdevs_list": [ 00:14:16.594 { 00:14:16.594 "name": "NewBaseBdev", 00:14:16.594 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:16.594 "is_configured": true, 00:14:16.594 "data_offset": 0, 00:14:16.594 "data_size": 65536 00:14:16.594 }, 00:14:16.594 { 00:14:16.594 "name": "BaseBdev2", 00:14:16.594 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:16.594 "is_configured": true, 00:14:16.594 "data_offset": 0, 00:14:16.594 "data_size": 65536 00:14:16.594 }, 00:14:16.594 { 00:14:16.594 "name": "BaseBdev3", 00:14:16.594 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:16.594 "is_configured": true, 00:14:16.594 "data_offset": 0, 00:14:16.594 "data_size": 65536 00:14:16.594 } 00:14:16.594 ] 00:14:16.594 } 00:14:16.594 } 00:14:16.594 }' 00:14:16.594 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:16.852 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:14:16.852 BaseBdev2 00:14:16.852 BaseBdev3' 00:14:16.852 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:16.852 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:16.852 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:17.111 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:17.111 "name": "NewBaseBdev", 00:14:17.111 "aliases": [ 00:14:17.111 "f7bddf0e-3681-42c8-800b-2e6db784ddd1" 
00:14:17.111 ], 00:14:17.111 "product_name": "Malloc disk", 00:14:17.111 "block_size": 512, 00:14:17.111 "num_blocks": 65536, 00:14:17.111 "uuid": "f7bddf0e-3681-42c8-800b-2e6db784ddd1", 00:14:17.111 "assigned_rate_limits": { 00:14:17.111 "rw_ios_per_sec": 0, 00:14:17.111 "rw_mbytes_per_sec": 0, 00:14:17.111 "r_mbytes_per_sec": 0, 00:14:17.111 "w_mbytes_per_sec": 0 00:14:17.111 }, 00:14:17.111 "claimed": true, 00:14:17.111 "claim_type": "exclusive_write", 00:14:17.111 "zoned": false, 00:14:17.111 "supported_io_types": { 00:14:17.111 "read": true, 00:14:17.111 "write": true, 00:14:17.111 "unmap": true, 00:14:17.111 "write_zeroes": true, 00:14:17.111 "flush": true, 00:14:17.111 "reset": true, 00:14:17.111 "compare": false, 00:14:17.111 "compare_and_write": false, 00:14:17.111 "abort": true, 00:14:17.111 "nvme_admin": false, 00:14:17.111 "nvme_io": false 00:14:17.111 }, 00:14:17.111 "memory_domains": [ 00:14:17.111 { 00:14:17.111 "dma_device_id": "system", 00:14:17.111 "dma_device_type": 1 00:14:17.111 }, 00:14:17.111 { 00:14:17.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.111 "dma_device_type": 2 00:14:17.111 } 00:14:17.111 ], 00:14:17.111 "driver_specific": {} 00:14:17.111 }' 00:14:17.111 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:17.111 11:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:17.111 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:17.111 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:17.111 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:17.111 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:17.111 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:17.111 11:50:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:17.370 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:17.370 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:17.370 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:17.370 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:17.370 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:17.370 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:17.370 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:17.628 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:17.628 "name": "BaseBdev2", 00:14:17.628 "aliases": [ 00:14:17.628 "1e920988-45fb-4751-9ded-9fb3261ba3d7" 00:14:17.628 ], 00:14:17.628 "product_name": "Malloc disk", 00:14:17.628 "block_size": 512, 00:14:17.628 "num_blocks": 65536, 00:14:17.628 "uuid": "1e920988-45fb-4751-9ded-9fb3261ba3d7", 00:14:17.628 "assigned_rate_limits": { 00:14:17.628 "rw_ios_per_sec": 0, 00:14:17.628 "rw_mbytes_per_sec": 0, 00:14:17.628 "r_mbytes_per_sec": 0, 00:14:17.628 "w_mbytes_per_sec": 0 00:14:17.628 }, 00:14:17.628 "claimed": true, 00:14:17.628 "claim_type": "exclusive_write", 00:14:17.628 "zoned": false, 00:14:17.628 "supported_io_types": { 00:14:17.628 "read": true, 00:14:17.628 "write": true, 00:14:17.628 "unmap": true, 00:14:17.628 "write_zeroes": true, 00:14:17.628 "flush": true, 00:14:17.628 "reset": true, 00:14:17.628 "compare": false, 00:14:17.628 "compare_and_write": false, 00:14:17.628 "abort": true, 00:14:17.628 "nvme_admin": false, 00:14:17.628 "nvme_io": false 00:14:17.628 }, 00:14:17.628 
"memory_domains": [ 00:14:17.628 { 00:14:17.628 "dma_device_id": "system", 00:14:17.628 "dma_device_type": 1 00:14:17.628 }, 00:14:17.628 { 00:14:17.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.628 "dma_device_type": 2 00:14:17.628 } 00:14:17.628 ], 00:14:17.628 "driver_specific": {} 00:14:17.628 }' 00:14:17.628 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:17.628 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:17.628 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:17.628 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:17.628 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:17.888 11:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:18.147 11:50:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:18.147 "name": "BaseBdev3", 00:14:18.148 "aliases": [ 00:14:18.148 "c79a3164-5dc3-4ab9-8b24-1b0912986327" 00:14:18.148 ], 00:14:18.148 "product_name": "Malloc disk", 00:14:18.148 "block_size": 512, 00:14:18.148 "num_blocks": 65536, 00:14:18.148 "uuid": "c79a3164-5dc3-4ab9-8b24-1b0912986327", 00:14:18.148 "assigned_rate_limits": { 00:14:18.148 "rw_ios_per_sec": 0, 00:14:18.148 "rw_mbytes_per_sec": 0, 00:14:18.148 "r_mbytes_per_sec": 0, 00:14:18.148 "w_mbytes_per_sec": 0 00:14:18.148 }, 00:14:18.148 "claimed": true, 00:14:18.148 "claim_type": "exclusive_write", 00:14:18.148 "zoned": false, 00:14:18.148 "supported_io_types": { 00:14:18.148 "read": true, 00:14:18.148 "write": true, 00:14:18.148 "unmap": true, 00:14:18.148 "write_zeroes": true, 00:14:18.148 "flush": true, 00:14:18.148 "reset": true, 00:14:18.148 "compare": false, 00:14:18.148 "compare_and_write": false, 00:14:18.148 "abort": true, 00:14:18.148 "nvme_admin": false, 00:14:18.148 "nvme_io": false 00:14:18.148 }, 00:14:18.148 "memory_domains": [ 00:14:18.148 { 00:14:18.148 "dma_device_id": "system", 00:14:18.148 "dma_device_type": 1 00:14:18.148 }, 00:14:18.148 { 00:14:18.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.148 "dma_device_type": 2 00:14:18.148 } 00:14:18.148 ], 00:14:18.148 "driver_specific": {} 00:14:18.148 }' 00:14:18.148 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:18.148 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:18.148 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:18.406 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:18.406 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:18.406 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null 
== null ]] 00:14:18.406 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:18.406 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:18.406 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.406 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:18.406 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:18.665 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:18.665 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:18.665 [2024-05-14 11:50:45.724526] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:18.665 [2024-05-14 11:50:45.724552] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:18.665 [2024-05-14 11:50:45.724602] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:18.665 [2024-05-14 11:50:45.724863] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:18.665 [2024-05-14 11:50:45.724875] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10ef790 name Existed_Raid, state offline 00:14:18.665 11:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 1699898 00:14:18.665 11:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 1699898 ']' 00:14:18.665 11:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 1699898 00:14:18.665 11:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:14:18.665 11:50:45 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:14:18.925 11:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1699898 00:14:18.925 11:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:14:18.925 11:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:14:18.925 11:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1699898' 00:14:18.925 killing process with pid 1699898 00:14:18.925 11:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 1699898 00:14:18.925 [2024-05-14 11:50:45.793672] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:18.925 11:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 1699898 00:14:18.925 [2024-05-14 11:50:45.824146] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:14:19.183 00:14:19.183 real 0m27.431s 00:14:19.183 user 0m50.244s 00:14:19.183 sys 0m4.998s 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.183 ************************************ 00:14:19.183 END TEST raid_state_function_test 00:14:19.183 ************************************ 00:14:19.183 11:50:46 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:14:19.183 11:50:46 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:14:19.183 11:50:46 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:14:19.183 11:50:46 bdev_raid -- common/autotest_common.sh@10 -- # 
set +x 00:14:19.183 ************************************ 00:14:19.183 START TEST raid_state_function_test_sb 00:14:19.183 ************************************ 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 3 true 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:19.183 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=1704030 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1704030' 00:14:19.184 Process raid pid: 1704030 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 1704030 /var/tmp/spdk-raid.sock 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1704030 ']' 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:19.184 11:50:46 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:19.184 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:14:19.184 11:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:19.184 [2024-05-14 11:50:46.205433] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:14:19.184 [2024-05-14 11:50:46.205501] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:19.442 [2024-05-14 11:50:46.325876] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:19.442 [2024-05-14 11:50:46.436444] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:19.442 [2024-05-14 11:50:46.497987] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:19.442 [2024-05-14 11:50:46.498014] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:20.378 [2024-05-14 11:50:47.360768] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev1 00:14:20.378 [2024-05-14 11:50:47.360813] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:20.378 [2024-05-14 11:50:47.360824] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:20.378 [2024-05-14 11:50:47.360837] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:20.378 [2024-05-14 11:50:47.360846] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:20.378 [2024-05-14 11:50:47.360857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.378 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.636 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:20.636 "name": "Existed_Raid", 00:14:20.636 "uuid": "ed1fb87b-5fa4-4f7e-b7c9-fb9bde3b60fa", 00:14:20.636 "strip_size_kb": 0, 00:14:20.636 "state": "configuring", 00:14:20.636 "raid_level": "raid1", 00:14:20.636 "superblock": true, 00:14:20.636 "num_base_bdevs": 3, 00:14:20.636 "num_base_bdevs_discovered": 0, 00:14:20.636 "num_base_bdevs_operational": 3, 00:14:20.636 "base_bdevs_list": [ 00:14:20.636 { 00:14:20.636 "name": "BaseBdev1", 00:14:20.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.636 "is_configured": false, 00:14:20.636 "data_offset": 0, 00:14:20.636 "data_size": 0 00:14:20.636 }, 00:14:20.636 { 00:14:20.636 "name": "BaseBdev2", 00:14:20.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.636 "is_configured": false, 00:14:20.636 "data_offset": 0, 00:14:20.636 "data_size": 0 00:14:20.636 }, 00:14:20.636 { 00:14:20.636 "name": "BaseBdev3", 00:14:20.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.636 "is_configured": false, 00:14:20.636 "data_offset": 0, 00:14:20.636 "data_size": 0 00:14:20.636 } 00:14:20.636 ] 00:14:20.636 }' 00:14:20.636 11:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:20.636 11:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:21.201 11:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:21.458 [2024-05-14 11:50:48.447504] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:21.458 
[2024-05-14 11:50:48.447535] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbaa700 name Existed_Raid, state configuring 00:14:21.458 11:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:21.716 [2024-05-14 11:50:48.692163] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:21.716 [2024-05-14 11:50:48.692191] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:21.716 [2024-05-14 11:50:48.692201] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:21.716 [2024-05-14 11:50:48.692212] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:21.716 [2024-05-14 11:50:48.692221] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:21.716 [2024-05-14 11:50:48.692233] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:21.716 11:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:21.974 [2024-05-14 11:50:48.946685] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:21.974 BaseBdev1 00:14:21.974 11:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:14:21.974 11:50:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:14:21.974 11:50:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:21.974 11:50:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 
-- # local i 00:14:21.974 11:50:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:21.974 11:50:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:21.974 11:50:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:22.232 11:50:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:22.491 [ 00:14:22.491 { 00:14:22.491 "name": "BaseBdev1", 00:14:22.491 "aliases": [ 00:14:22.491 "22505ced-5700-4c36-87a0-a38ba9f7f8fb" 00:14:22.491 ], 00:14:22.491 "product_name": "Malloc disk", 00:14:22.491 "block_size": 512, 00:14:22.491 "num_blocks": 65536, 00:14:22.491 "uuid": "22505ced-5700-4c36-87a0-a38ba9f7f8fb", 00:14:22.491 "assigned_rate_limits": { 00:14:22.491 "rw_ios_per_sec": 0, 00:14:22.491 "rw_mbytes_per_sec": 0, 00:14:22.491 "r_mbytes_per_sec": 0, 00:14:22.491 "w_mbytes_per_sec": 0 00:14:22.491 }, 00:14:22.491 "claimed": true, 00:14:22.491 "claim_type": "exclusive_write", 00:14:22.491 "zoned": false, 00:14:22.491 "supported_io_types": { 00:14:22.491 "read": true, 00:14:22.491 "write": true, 00:14:22.491 "unmap": true, 00:14:22.491 "write_zeroes": true, 00:14:22.491 "flush": true, 00:14:22.491 "reset": true, 00:14:22.491 "compare": false, 00:14:22.491 "compare_and_write": false, 00:14:22.491 "abort": true, 00:14:22.491 "nvme_admin": false, 00:14:22.491 "nvme_io": false 00:14:22.491 }, 00:14:22.491 "memory_domains": [ 00:14:22.491 { 00:14:22.491 "dma_device_id": "system", 00:14:22.491 "dma_device_type": 1 00:14:22.491 }, 00:14:22.491 { 00:14:22.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:22.491 "dma_device_type": 2 00:14:22.491 } 00:14:22.491 ], 00:14:22.491 "driver_specific": 
{} 00:14:22.491 } 00:14:22.491 ] 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.491 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:22.750 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:22.750 "name": "Existed_Raid", 00:14:22.750 "uuid": "8f9d2067-e406-4d58-b2de-c70c4788bd7a", 00:14:22.750 "strip_size_kb": 0, 00:14:22.750 "state": "configuring", 00:14:22.750 "raid_level": "raid1", 00:14:22.750 
"superblock": true, 00:14:22.750 "num_base_bdevs": 3, 00:14:22.750 "num_base_bdevs_discovered": 1, 00:14:22.750 "num_base_bdevs_operational": 3, 00:14:22.750 "base_bdevs_list": [ 00:14:22.750 { 00:14:22.750 "name": "BaseBdev1", 00:14:22.750 "uuid": "22505ced-5700-4c36-87a0-a38ba9f7f8fb", 00:14:22.750 "is_configured": true, 00:14:22.750 "data_offset": 2048, 00:14:22.750 "data_size": 63488 00:14:22.750 }, 00:14:22.750 { 00:14:22.750 "name": "BaseBdev2", 00:14:22.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.750 "is_configured": false, 00:14:22.750 "data_offset": 0, 00:14:22.750 "data_size": 0 00:14:22.750 }, 00:14:22.750 { 00:14:22.750 "name": "BaseBdev3", 00:14:22.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.750 "is_configured": false, 00:14:22.750 "data_offset": 0, 00:14:22.750 "data_size": 0 00:14:22.750 } 00:14:22.750 ] 00:14:22.750 }' 00:14:22.750 11:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:22.750 11:50:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:23.318 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:23.580 [2024-05-14 11:50:50.518827] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:23.580 [2024-05-14 11:50:50.518867] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xba9ff0 name Existed_Raid, state configuring 00:14:23.580 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:23.839 [2024-05-14 11:50:50.783569] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:23.839 [2024-05-14 11:50:50.785098] 
bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:23.839 [2024-05-14 11:50:50.785131] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:23.839 [2024-05-14 11:50:50.785141] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:23.839 [2024-05-14 11:50:50.785153] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.839 11:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:24.098 11:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:24.098 "name": "Existed_Raid", 00:14:24.098 "uuid": "6c6e1e0b-a630-4e6b-8573-60852c653bf4", 00:14:24.098 "strip_size_kb": 0, 00:14:24.098 "state": "configuring", 00:14:24.098 "raid_level": "raid1", 00:14:24.098 "superblock": true, 00:14:24.098 "num_base_bdevs": 3, 00:14:24.098 "num_base_bdevs_discovered": 1, 00:14:24.098 "num_base_bdevs_operational": 3, 00:14:24.098 "base_bdevs_list": [ 00:14:24.098 { 00:14:24.098 "name": "BaseBdev1", 00:14:24.098 "uuid": "22505ced-5700-4c36-87a0-a38ba9f7f8fb", 00:14:24.098 "is_configured": true, 00:14:24.098 "data_offset": 2048, 00:14:24.098 "data_size": 63488 00:14:24.098 }, 00:14:24.098 { 00:14:24.098 "name": "BaseBdev2", 00:14:24.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.098 "is_configured": false, 00:14:24.098 "data_offset": 0, 00:14:24.098 "data_size": 0 00:14:24.098 }, 00:14:24.098 { 00:14:24.098 "name": "BaseBdev3", 00:14:24.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.098 "is_configured": false, 00:14:24.098 "data_offset": 0, 00:14:24.098 "data_size": 0 00:14:24.099 } 00:14:24.099 ] 00:14:24.099 }' 00:14:24.099 11:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:24.099 11:50:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:24.709 11:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:24.980 [2024-05-14 11:50:51.865810] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 
is claimed 00:14:24.980 BaseBdev2 00:14:24.980 11:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:14:24.980 11:50:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:14:24.980 11:50:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:24.980 11:50:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:24.980 11:50:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:24.980 11:50:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:24.980 11:50:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:25.239 11:50:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:25.497 [ 00:14:25.497 { 00:14:25.497 "name": "BaseBdev2", 00:14:25.497 "aliases": [ 00:14:25.497 "b5af0033-701e-4721-89fc-0ae0a3006168" 00:14:25.497 ], 00:14:25.497 "product_name": "Malloc disk", 00:14:25.497 "block_size": 512, 00:14:25.497 "num_blocks": 65536, 00:14:25.497 "uuid": "b5af0033-701e-4721-89fc-0ae0a3006168", 00:14:25.497 "assigned_rate_limits": { 00:14:25.497 "rw_ios_per_sec": 0, 00:14:25.497 "rw_mbytes_per_sec": 0, 00:14:25.497 "r_mbytes_per_sec": 0, 00:14:25.497 "w_mbytes_per_sec": 0 00:14:25.497 }, 00:14:25.497 "claimed": true, 00:14:25.497 "claim_type": "exclusive_write", 00:14:25.497 "zoned": false, 00:14:25.497 "supported_io_types": { 00:14:25.497 "read": true, 00:14:25.497 "write": true, 00:14:25.497 "unmap": true, 00:14:25.497 "write_zeroes": true, 00:14:25.497 "flush": true, 00:14:25.497 "reset": true, 
00:14:25.497 "compare": false, 00:14:25.497 "compare_and_write": false, 00:14:25.497 "abort": true, 00:14:25.497 "nvme_admin": false, 00:14:25.497 "nvme_io": false 00:14:25.497 }, 00:14:25.497 "memory_domains": [ 00:14:25.497 { 00:14:25.497 "dma_device_id": "system", 00:14:25.497 "dma_device_type": 1 00:14:25.497 }, 00:14:25.497 { 00:14:25.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:25.497 "dma_device_type": 2 00:14:25.497 } 00:14:25.497 ], 00:14:25.497 "driver_specific": {} 00:14:25.497 } 00:14:25.497 ] 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@125 -- # local tmp 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.497 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.756 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:25.756 "name": "Existed_Raid", 00:14:25.756 "uuid": "6c6e1e0b-a630-4e6b-8573-60852c653bf4", 00:14:25.756 "strip_size_kb": 0, 00:14:25.756 "state": "configuring", 00:14:25.756 "raid_level": "raid1", 00:14:25.756 "superblock": true, 00:14:25.756 "num_base_bdevs": 3, 00:14:25.756 "num_base_bdevs_discovered": 2, 00:14:25.756 "num_base_bdevs_operational": 3, 00:14:25.756 "base_bdevs_list": [ 00:14:25.756 { 00:14:25.756 "name": "BaseBdev1", 00:14:25.756 "uuid": "22505ced-5700-4c36-87a0-a38ba9f7f8fb", 00:14:25.756 "is_configured": true, 00:14:25.756 "data_offset": 2048, 00:14:25.756 "data_size": 63488 00:14:25.756 }, 00:14:25.756 { 00:14:25.756 "name": "BaseBdev2", 00:14:25.756 "uuid": "b5af0033-701e-4721-89fc-0ae0a3006168", 00:14:25.756 "is_configured": true, 00:14:25.756 "data_offset": 2048, 00:14:25.756 "data_size": 63488 00:14:25.756 }, 00:14:25.756 { 00:14:25.756 "name": "BaseBdev3", 00:14:25.756 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.756 "is_configured": false, 00:14:25.756 "data_offset": 0, 00:14:25.756 "data_size": 0 00:14:25.756 } 00:14:25.756 ] 00:14:25.756 }' 00:14:25.756 11:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:25.756 11:50:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:26.322 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev3 00:14:26.582 [2024-05-14 11:50:53.461525] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:26.582 [2024-05-14 11:50:53.461684] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xbab080 00:14:26.582 [2024-05-14 11:50:53.461703] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:26.582 [2024-05-14 11:50:53.461880] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbaad50 00:14:26.582 [2024-05-14 11:50:53.462005] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbab080 00:14:26.582 [2024-05-14 11:50:53.462015] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbab080 00:14:26.582 [2024-05-14 11:50:53.462114] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:26.582 BaseBdev3 00:14:26.582 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:14:26.582 11:50:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:14:26.582 11:50:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:26.582 11:50:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:26.582 11:50:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:26.582 11:50:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:26.582 11:50:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:26.841 11:50:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:27.100 [ 00:14:27.100 { 00:14:27.100 "name": "BaseBdev3", 00:14:27.100 "aliases": [ 00:14:27.100 "c609f847-c280-4254-adbb-22eb084109de" 00:14:27.100 ], 00:14:27.100 "product_name": "Malloc disk", 00:14:27.100 "block_size": 512, 00:14:27.100 "num_blocks": 65536, 00:14:27.100 "uuid": "c609f847-c280-4254-adbb-22eb084109de", 00:14:27.100 "assigned_rate_limits": { 00:14:27.100 "rw_ios_per_sec": 0, 00:14:27.100 "rw_mbytes_per_sec": 0, 00:14:27.100 "r_mbytes_per_sec": 0, 00:14:27.100 "w_mbytes_per_sec": 0 00:14:27.100 }, 00:14:27.100 "claimed": true, 00:14:27.100 "claim_type": "exclusive_write", 00:14:27.100 "zoned": false, 00:14:27.100 "supported_io_types": { 00:14:27.100 "read": true, 00:14:27.100 "write": true, 00:14:27.100 "unmap": true, 00:14:27.100 "write_zeroes": true, 00:14:27.100 "flush": true, 00:14:27.100 "reset": true, 00:14:27.100 "compare": false, 00:14:27.100 "compare_and_write": false, 00:14:27.100 "abort": true, 00:14:27.100 "nvme_admin": false, 00:14:27.100 "nvme_io": false 00:14:27.100 }, 00:14:27.100 "memory_domains": [ 00:14:27.100 { 00:14:27.101 "dma_device_id": "system", 00:14:27.101 "dma_device_type": 1 00:14:27.101 }, 00:14:27.101 { 00:14:27.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.101 "dma_device_type": 2 00:14:27.101 } 00:14:27.101 ], 00:14:27.101 "driver_specific": {} 00:14:27.101 } 00:14:27.101 ] 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=Existed_Raid 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.101 11:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:27.360 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:27.360 "name": "Existed_Raid", 00:14:27.360 "uuid": "6c6e1e0b-a630-4e6b-8573-60852c653bf4", 00:14:27.360 "strip_size_kb": 0, 00:14:27.360 "state": "online", 00:14:27.360 "raid_level": "raid1", 00:14:27.360 "superblock": true, 00:14:27.360 "num_base_bdevs": 3, 00:14:27.360 "num_base_bdevs_discovered": 3, 00:14:27.360 "num_base_bdevs_operational": 3, 00:14:27.360 "base_bdevs_list": [ 00:14:27.360 { 00:14:27.360 "name": "BaseBdev1", 00:14:27.360 "uuid": "22505ced-5700-4c36-87a0-a38ba9f7f8fb", 00:14:27.360 "is_configured": true, 00:14:27.360 "data_offset": 2048, 00:14:27.360 "data_size": 63488 
00:14:27.360 }, 00:14:27.360 { 00:14:27.360 "name": "BaseBdev2", 00:14:27.360 "uuid": "b5af0033-701e-4721-89fc-0ae0a3006168", 00:14:27.360 "is_configured": true, 00:14:27.360 "data_offset": 2048, 00:14:27.360 "data_size": 63488 00:14:27.360 }, 00:14:27.360 { 00:14:27.360 "name": "BaseBdev3", 00:14:27.360 "uuid": "c609f847-c280-4254-adbb-22eb084109de", 00:14:27.360 "is_configured": true, 00:14:27.360 "data_offset": 2048, 00:14:27.360 "data_size": 63488 00:14:27.360 } 00:14:27.360 ] 00:14:27.360 }' 00:14:27.360 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:27.360 11:50:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:27.928 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:14:27.928 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:27.928 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:27.928 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:27.928 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:27.928 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:14:27.928 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:27.928 11:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:28.188 [2024-05-14 11:50:55.034080] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:28.188 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:28.188 "name": "Existed_Raid", 00:14:28.188 
"aliases": [ 00:14:28.188 "6c6e1e0b-a630-4e6b-8573-60852c653bf4" 00:14:28.188 ], 00:14:28.188 "product_name": "Raid Volume", 00:14:28.188 "block_size": 512, 00:14:28.188 "num_blocks": 63488, 00:14:28.188 "uuid": "6c6e1e0b-a630-4e6b-8573-60852c653bf4", 00:14:28.188 "assigned_rate_limits": { 00:14:28.188 "rw_ios_per_sec": 0, 00:14:28.188 "rw_mbytes_per_sec": 0, 00:14:28.188 "r_mbytes_per_sec": 0, 00:14:28.188 "w_mbytes_per_sec": 0 00:14:28.188 }, 00:14:28.188 "claimed": false, 00:14:28.188 "zoned": false, 00:14:28.188 "supported_io_types": { 00:14:28.188 "read": true, 00:14:28.188 "write": true, 00:14:28.188 "unmap": false, 00:14:28.188 "write_zeroes": true, 00:14:28.188 "flush": false, 00:14:28.188 "reset": true, 00:14:28.188 "compare": false, 00:14:28.188 "compare_and_write": false, 00:14:28.188 "abort": false, 00:14:28.188 "nvme_admin": false, 00:14:28.188 "nvme_io": false 00:14:28.188 }, 00:14:28.188 "memory_domains": [ 00:14:28.188 { 00:14:28.188 "dma_device_id": "system", 00:14:28.188 "dma_device_type": 1 00:14:28.188 }, 00:14:28.188 { 00:14:28.188 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.188 "dma_device_type": 2 00:14:28.188 }, 00:14:28.188 { 00:14:28.188 "dma_device_id": "system", 00:14:28.188 "dma_device_type": 1 00:14:28.188 }, 00:14:28.188 { 00:14:28.188 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.188 "dma_device_type": 2 00:14:28.188 }, 00:14:28.188 { 00:14:28.188 "dma_device_id": "system", 00:14:28.188 "dma_device_type": 1 00:14:28.188 }, 00:14:28.188 { 00:14:28.188 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.188 "dma_device_type": 2 00:14:28.188 } 00:14:28.188 ], 00:14:28.188 "driver_specific": { 00:14:28.188 "raid": { 00:14:28.188 "uuid": "6c6e1e0b-a630-4e6b-8573-60852c653bf4", 00:14:28.188 "strip_size_kb": 0, 00:14:28.188 "state": "online", 00:14:28.188 "raid_level": "raid1", 00:14:28.188 "superblock": true, 00:14:28.188 "num_base_bdevs": 3, 00:14:28.188 "num_base_bdevs_discovered": 3, 00:14:28.188 "num_base_bdevs_operational": 
3, 00:14:28.188 "base_bdevs_list": [ 00:14:28.188 { 00:14:28.188 "name": "BaseBdev1", 00:14:28.188 "uuid": "22505ced-5700-4c36-87a0-a38ba9f7f8fb", 00:14:28.188 "is_configured": true, 00:14:28.188 "data_offset": 2048, 00:14:28.188 "data_size": 63488 00:14:28.188 }, 00:14:28.188 { 00:14:28.188 "name": "BaseBdev2", 00:14:28.188 "uuid": "b5af0033-701e-4721-89fc-0ae0a3006168", 00:14:28.189 "is_configured": true, 00:14:28.189 "data_offset": 2048, 00:14:28.189 "data_size": 63488 00:14:28.189 }, 00:14:28.189 { 00:14:28.189 "name": "BaseBdev3", 00:14:28.189 "uuid": "c609f847-c280-4254-adbb-22eb084109de", 00:14:28.189 "is_configured": true, 00:14:28.189 "data_offset": 2048, 00:14:28.189 "data_size": 63488 00:14:28.189 } 00:14:28.189 ] 00:14:28.189 } 00:14:28.189 } 00:14:28.189 }' 00:14:28.189 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:28.189 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:14:28.189 BaseBdev2 00:14:28.189 BaseBdev3' 00:14:28.189 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:28.189 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:28.189 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:28.448 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:28.448 "name": "BaseBdev1", 00:14:28.448 "aliases": [ 00:14:28.448 "22505ced-5700-4c36-87a0-a38ba9f7f8fb" 00:14:28.448 ], 00:14:28.448 "product_name": "Malloc disk", 00:14:28.448 "block_size": 512, 00:14:28.448 "num_blocks": 65536, 00:14:28.448 "uuid": "22505ced-5700-4c36-87a0-a38ba9f7f8fb", 00:14:28.448 "assigned_rate_limits": { 
00:14:28.448 "rw_ios_per_sec": 0, 00:14:28.448 "rw_mbytes_per_sec": 0, 00:14:28.448 "r_mbytes_per_sec": 0, 00:14:28.448 "w_mbytes_per_sec": 0 00:14:28.448 }, 00:14:28.448 "claimed": true, 00:14:28.448 "claim_type": "exclusive_write", 00:14:28.448 "zoned": false, 00:14:28.448 "supported_io_types": { 00:14:28.448 "read": true, 00:14:28.448 "write": true, 00:14:28.448 "unmap": true, 00:14:28.448 "write_zeroes": true, 00:14:28.448 "flush": true, 00:14:28.448 "reset": true, 00:14:28.448 "compare": false, 00:14:28.448 "compare_and_write": false, 00:14:28.448 "abort": true, 00:14:28.448 "nvme_admin": false, 00:14:28.448 "nvme_io": false 00:14:28.448 }, 00:14:28.448 "memory_domains": [ 00:14:28.448 { 00:14:28.448 "dma_device_id": "system", 00:14:28.448 "dma_device_type": 1 00:14:28.448 }, 00:14:28.448 { 00:14:28.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.448 "dma_device_type": 2 00:14:28.448 } 00:14:28.448 ], 00:14:28.448 "driver_specific": {} 00:14:28.448 }' 00:14:28.448 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:28.448 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:28.448 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:28.448 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:28.448 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:28.448 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:28.448 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:28.708 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:28.708 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:28.708 11:50:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:28.708 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:28.708 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:28.708 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:28.708 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:28.708 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:28.967 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:28.967 "name": "BaseBdev2", 00:14:28.967 "aliases": [ 00:14:28.967 "b5af0033-701e-4721-89fc-0ae0a3006168" 00:14:28.967 ], 00:14:28.967 "product_name": "Malloc disk", 00:14:28.967 "block_size": 512, 00:14:28.967 "num_blocks": 65536, 00:14:28.967 "uuid": "b5af0033-701e-4721-89fc-0ae0a3006168", 00:14:28.967 "assigned_rate_limits": { 00:14:28.967 "rw_ios_per_sec": 0, 00:14:28.967 "rw_mbytes_per_sec": 0, 00:14:28.967 "r_mbytes_per_sec": 0, 00:14:28.967 "w_mbytes_per_sec": 0 00:14:28.967 }, 00:14:28.967 "claimed": true, 00:14:28.967 "claim_type": "exclusive_write", 00:14:28.967 "zoned": false, 00:14:28.967 "supported_io_types": { 00:14:28.967 "read": true, 00:14:28.967 "write": true, 00:14:28.967 "unmap": true, 00:14:28.967 "write_zeroes": true, 00:14:28.967 "flush": true, 00:14:28.967 "reset": true, 00:14:28.967 "compare": false, 00:14:28.967 "compare_and_write": false, 00:14:28.967 "abort": true, 00:14:28.967 "nvme_admin": false, 00:14:28.967 "nvme_io": false 00:14:28.967 }, 00:14:28.967 "memory_domains": [ 00:14:28.967 { 00:14:28.967 "dma_device_id": "system", 00:14:28.967 "dma_device_type": 1 00:14:28.967 }, 00:14:28.967 { 00:14:28.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.967 
"dma_device_type": 2 00:14:28.967 } 00:14:28.967 ], 00:14:28.967 "driver_specific": {} 00:14:28.967 }' 00:14:28.967 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:28.967 11:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:28.967 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:28.967 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:28.967 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:29.227 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:29.486 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:29.486 "name": "BaseBdev3", 00:14:29.486 "aliases": [ 00:14:29.486 
"c609f847-c280-4254-adbb-22eb084109de" 00:14:29.486 ], 00:14:29.486 "product_name": "Malloc disk", 00:14:29.486 "block_size": 512, 00:14:29.486 "num_blocks": 65536, 00:14:29.486 "uuid": "c609f847-c280-4254-adbb-22eb084109de", 00:14:29.486 "assigned_rate_limits": { 00:14:29.486 "rw_ios_per_sec": 0, 00:14:29.486 "rw_mbytes_per_sec": 0, 00:14:29.486 "r_mbytes_per_sec": 0, 00:14:29.486 "w_mbytes_per_sec": 0 00:14:29.486 }, 00:14:29.486 "claimed": true, 00:14:29.486 "claim_type": "exclusive_write", 00:14:29.486 "zoned": false, 00:14:29.486 "supported_io_types": { 00:14:29.486 "read": true, 00:14:29.486 "write": true, 00:14:29.486 "unmap": true, 00:14:29.486 "write_zeroes": true, 00:14:29.486 "flush": true, 00:14:29.486 "reset": true, 00:14:29.486 "compare": false, 00:14:29.486 "compare_and_write": false, 00:14:29.486 "abort": true, 00:14:29.486 "nvme_admin": false, 00:14:29.486 "nvme_io": false 00:14:29.486 }, 00:14:29.486 "memory_domains": [ 00:14:29.486 { 00:14:29.486 "dma_device_id": "system", 00:14:29.486 "dma_device_type": 1 00:14:29.486 }, 00:14:29.486 { 00:14:29.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.486 "dma_device_type": 2 00:14:29.486 } 00:14:29.486 ], 00:14:29.486 "driver_specific": {} 00:14:29.486 }' 00:14:29.486 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:29.486 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:29.486 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:29.486 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:29.745 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:29.745 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:29.745 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:29.745 
11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:29.745 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:29.745 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:29.745 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:30.005 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:30.005 11:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:30.005 [2024-05-14 11:50:57.051213] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 0 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:30.005 11:50:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.005 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:30.265 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:30.265 "name": "Existed_Raid", 00:14:30.265 "uuid": "6c6e1e0b-a630-4e6b-8573-60852c653bf4", 00:14:30.265 "strip_size_kb": 0, 00:14:30.265 "state": "online", 00:14:30.265 "raid_level": "raid1", 00:14:30.265 "superblock": true, 00:14:30.265 "num_base_bdevs": 3, 00:14:30.265 "num_base_bdevs_discovered": 2, 00:14:30.265 "num_base_bdevs_operational": 2, 00:14:30.265 "base_bdevs_list": [ 00:14:30.265 { 00:14:30.265 "name": null, 00:14:30.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.265 "is_configured": false, 00:14:30.265 "data_offset": 2048, 00:14:30.265 "data_size": 63488 00:14:30.265 }, 00:14:30.265 { 00:14:30.265 "name": "BaseBdev2", 00:14:30.265 "uuid": "b5af0033-701e-4721-89fc-0ae0a3006168", 00:14:30.265 "is_configured": true, 00:14:30.265 "data_offset": 2048, 00:14:30.265 "data_size": 63488 00:14:30.265 }, 00:14:30.265 { 00:14:30.265 "name": "BaseBdev3", 00:14:30.265 "uuid": "c609f847-c280-4254-adbb-22eb084109de", 00:14:30.265 "is_configured": true, 00:14:30.265 
"data_offset": 2048, 00:14:30.265 "data_size": 63488 00:14:30.265 } 00:14:30.265 ] 00:14:30.265 }' 00:14:30.265 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:30.265 11:50:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:31.203 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:14:31.203 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:31.203 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.203 11:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:31.203 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:31.203 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:31.203 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:31.462 [2024-05-14 11:50:58.395903] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:31.462 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:31.462 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:31.462 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.462 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:31.722 11:50:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:31.722 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:31.722 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:31.982 [2024-05-14 11:50:58.891981] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:31.982 [2024-05-14 11:50:58.892058] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:31.982 [2024-05-14 11:50:58.904931] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:31.982 [2024-05-14 11:50:58.905003] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:31.982 [2024-05-14 11:50:58.905017] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbab080 name Existed_Raid, state offline 00:14:31.982 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:31.982 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:31.982 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.982 11:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:14:32.241 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:14:32.241 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:14:32.241 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:14:32.241 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 
1 )) 00:14:32.241 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:32.241 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:32.501 BaseBdev2 00:14:32.501 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:14:32.501 11:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:14:32.501 11:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:32.501 11:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:32.501 11:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:32.501 11:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:32.501 11:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:32.761 11:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:33.020 [ 00:14:33.020 { 00:14:33.020 "name": "BaseBdev2", 00:14:33.020 "aliases": [ 00:14:33.020 "06202232-28a7-4497-9f10-e93fc0c589a4" 00:14:33.020 ], 00:14:33.020 "product_name": "Malloc disk", 00:14:33.020 "block_size": 512, 00:14:33.020 "num_blocks": 65536, 00:14:33.020 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:33.020 "assigned_rate_limits": { 00:14:33.020 "rw_ios_per_sec": 0, 00:14:33.020 "rw_mbytes_per_sec": 0, 00:14:33.020 "r_mbytes_per_sec": 0, 00:14:33.020 "w_mbytes_per_sec": 0 
00:14:33.020 }, 00:14:33.020 "claimed": false, 00:14:33.020 "zoned": false, 00:14:33.020 "supported_io_types": { 00:14:33.020 "read": true, 00:14:33.020 "write": true, 00:14:33.020 "unmap": true, 00:14:33.020 "write_zeroes": true, 00:14:33.020 "flush": true, 00:14:33.020 "reset": true, 00:14:33.020 "compare": false, 00:14:33.020 "compare_and_write": false, 00:14:33.020 "abort": true, 00:14:33.020 "nvme_admin": false, 00:14:33.020 "nvme_io": false 00:14:33.020 }, 00:14:33.020 "memory_domains": [ 00:14:33.020 { 00:14:33.020 "dma_device_id": "system", 00:14:33.020 "dma_device_type": 1 00:14:33.020 }, 00:14:33.020 { 00:14:33.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.020 "dma_device_type": 2 00:14:33.020 } 00:14:33.020 ], 00:14:33.020 "driver_specific": {} 00:14:33.020 } 00:14:33.020 ] 00:14:33.020 11:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:33.020 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:33.020 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:33.020 11:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:33.279 BaseBdev3 00:14:33.279 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:14:33.279 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:14:33.279 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:33.279 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:33.279 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:33.279 11:51:00 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:33.279 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:33.538 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:33.538 [ 00:14:33.538 { 00:14:33.538 "name": "BaseBdev3", 00:14:33.538 "aliases": [ 00:14:33.538 "e4899f82-c194-4298-8f9c-afda195e4c9e" 00:14:33.538 ], 00:14:33.538 "product_name": "Malloc disk", 00:14:33.538 "block_size": 512, 00:14:33.538 "num_blocks": 65536, 00:14:33.538 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:33.538 "assigned_rate_limits": { 00:14:33.538 "rw_ios_per_sec": 0, 00:14:33.538 "rw_mbytes_per_sec": 0, 00:14:33.538 "r_mbytes_per_sec": 0, 00:14:33.538 "w_mbytes_per_sec": 0 00:14:33.538 }, 00:14:33.538 "claimed": false, 00:14:33.538 "zoned": false, 00:14:33.538 "supported_io_types": { 00:14:33.538 "read": true, 00:14:33.538 "write": true, 00:14:33.538 "unmap": true, 00:14:33.538 "write_zeroes": true, 00:14:33.538 "flush": true, 00:14:33.538 "reset": true, 00:14:33.538 "compare": false, 00:14:33.538 "compare_and_write": false, 00:14:33.538 "abort": true, 00:14:33.538 "nvme_admin": false, 00:14:33.538 "nvme_io": false 00:14:33.538 }, 00:14:33.538 "memory_domains": [ 00:14:33.538 { 00:14:33.538 "dma_device_id": "system", 00:14:33.538 "dma_device_type": 1 00:14:33.538 }, 00:14:33.538 { 00:14:33.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.538 "dma_device_type": 2 00:14:33.538 } 00:14:33.538 ], 00:14:33.538 "driver_specific": {} 00:14:33.538 } 00:14:33.538 ] 00:14:33.538 11:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:33.538 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- 
# (( i++ )) 00:14:33.538 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:33.538 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:33.798 [2024-05-14 11:51:00.759179] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:33.798 [2024-05-14 11:51:00.759221] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:33.798 [2024-05-14 11:51:00.759244] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:33.798 [2024-05-14 11:51:00.760569] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:33.798 
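The trace above repeatedly pipes `rpc.py ... bdev_raid_get_bdevs all` through `jq -r '.[] | select(.name == "Existed_Raid")'` to pull one raid bdev out of the RPC result array, and earlier (bdev_raid.sh@202) uses `select(.is_configured == true).name` to list only the configured base bdevs. A minimal Python re-expression of those two selections; the JSON here is a trimmed, hypothetical sample shaped like the log's `bdev_raid_get_bdevs` output, not a verbatim capture:

```python
import json

# Hypothetical trimmed sample modeled on `bdev_raid_get_bdevs all` output
# seen in the log (base_bdevs_list is top-level in this RPC's schema).
rpc_output = json.loads("""
[
  {
    "name": "Existed_Raid",
    "raid_level": "raid1",
    "state": "configuring",
    "base_bdevs_list": [
      {"name": "BaseBdev1", "is_configured": true},
      {"name": null,        "is_configured": false},
      {"name": "BaseBdev3", "is_configured": true}
    ]
  }
]
""")

# jq: .[] | select(.name == "Existed_Raid")
raid_info = next(b for b in rpc_output if b["name"] == "Existed_Raid")

# jq: .base_bdevs_list[] | select(.is_configured == true).name
configured = [b["name"] for b in raid_info["base_bdevs_list"] if b["is_configured"]]
print(raid_info["state"], configured)
```

The same pattern applies to the `bdev_get_bdevs` output earlier in the log, where the list sits under `.driver_specific.raid.base_bdevs_list` instead.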
11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.798 11:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.057 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:34.057 "name": "Existed_Raid", 00:14:34.057 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:34.057 "strip_size_kb": 0, 00:14:34.057 "state": "configuring", 00:14:34.057 "raid_level": "raid1", 00:14:34.057 "superblock": true, 00:14:34.057 "num_base_bdevs": 3, 00:14:34.057 "num_base_bdevs_discovered": 2, 00:14:34.057 "num_base_bdevs_operational": 3, 00:14:34.057 "base_bdevs_list": [ 00:14:34.057 { 00:14:34.057 "name": "BaseBdev1", 00:14:34.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.057 "is_configured": false, 00:14:34.057 "data_offset": 0, 00:14:34.057 "data_size": 0 00:14:34.057 }, 00:14:34.057 { 00:14:34.057 "name": "BaseBdev2", 00:14:34.057 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:34.057 "is_configured": true, 00:14:34.057 "data_offset": 2048, 00:14:34.057 "data_size": 63488 00:14:34.057 }, 00:14:34.057 { 00:14:34.057 "name": "BaseBdev3", 00:14:34.057 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:34.057 "is_configured": true, 00:14:34.057 "data_offset": 2048, 00:14:34.057 "data_size": 63488 00:14:34.057 } 00:14:34.057 ] 00:14:34.057 }' 00:14:34.057 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:34.057 11:51:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:34.624 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:34.884 [2024-05-14 11:51:01.801922] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.884 11:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.143 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:35.143 "name": "Existed_Raid", 00:14:35.143 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:35.143 "strip_size_kb": 0, 
00:14:35.143 "state": "configuring", 00:14:35.143 "raid_level": "raid1", 00:14:35.143 "superblock": true, 00:14:35.143 "num_base_bdevs": 3, 00:14:35.143 "num_base_bdevs_discovered": 1, 00:14:35.143 "num_base_bdevs_operational": 3, 00:14:35.143 "base_bdevs_list": [ 00:14:35.144 { 00:14:35.144 "name": "BaseBdev1", 00:14:35.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.144 "is_configured": false, 00:14:35.144 "data_offset": 0, 00:14:35.144 "data_size": 0 00:14:35.144 }, 00:14:35.144 { 00:14:35.144 "name": null, 00:14:35.144 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:35.144 "is_configured": false, 00:14:35.144 "data_offset": 2048, 00:14:35.144 "data_size": 63488 00:14:35.144 }, 00:14:35.144 { 00:14:35.144 "name": "BaseBdev3", 00:14:35.144 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:35.144 "is_configured": true, 00:14:35.144 "data_offset": 2048, 00:14:35.144 "data_size": 63488 00:14:35.144 } 00:14:35.144 ] 00:14:35.144 }' 00:14:35.144 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:35.144 11:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:35.712 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.712 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:35.971 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:14:35.971 11:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:36.230 [2024-05-14 11:51:03.088906] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:36.230 BaseBdev1 
00:14:36.230 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:14:36.230 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:14:36.230 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:36.230 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:36.230 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:36.230 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:36.230 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:36.489 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:36.749 [ 00:14:36.749 { 00:14:36.749 "name": "BaseBdev1", 00:14:36.749 "aliases": [ 00:14:36.749 "dbff1595-c413-41e5-9996-c80d1a938768" 00:14:36.749 ], 00:14:36.749 "product_name": "Malloc disk", 00:14:36.749 "block_size": 512, 00:14:36.749 "num_blocks": 65536, 00:14:36.749 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:36.749 "assigned_rate_limits": { 00:14:36.749 "rw_ios_per_sec": 0, 00:14:36.749 "rw_mbytes_per_sec": 0, 00:14:36.749 "r_mbytes_per_sec": 0, 00:14:36.749 "w_mbytes_per_sec": 0 00:14:36.749 }, 00:14:36.749 "claimed": true, 00:14:36.749 "claim_type": "exclusive_write", 00:14:36.749 "zoned": false, 00:14:36.749 "supported_io_types": { 00:14:36.749 "read": true, 00:14:36.749 "write": true, 00:14:36.749 "unmap": true, 00:14:36.749 "write_zeroes": true, 00:14:36.749 "flush": true, 00:14:36.749 "reset": true, 00:14:36.749 "compare": false, 
00:14:36.749 "compare_and_write": false, 00:14:36.749 "abort": true, 00:14:36.749 "nvme_admin": false, 00:14:36.749 "nvme_io": false 00:14:36.749 }, 00:14:36.749 "memory_domains": [ 00:14:36.749 { 00:14:36.749 "dma_device_id": "system", 00:14:36.749 "dma_device_type": 1 00:14:36.749 }, 00:14:36.749 { 00:14:36.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.749 "dma_device_type": 2 00:14:36.749 } 00:14:36.749 ], 00:14:36.749 "driver_specific": {} 00:14:36.749 } 00:14:36.749 ] 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:36.749 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.749 
11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.009 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:37.009 "name": "Existed_Raid", 00:14:37.009 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:37.009 "strip_size_kb": 0, 00:14:37.009 "state": "configuring", 00:14:37.009 "raid_level": "raid1", 00:14:37.009 "superblock": true, 00:14:37.009 "num_base_bdevs": 3, 00:14:37.009 "num_base_bdevs_discovered": 2, 00:14:37.009 "num_base_bdevs_operational": 3, 00:14:37.009 "base_bdevs_list": [ 00:14:37.009 { 00:14:37.009 "name": "BaseBdev1", 00:14:37.009 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:37.009 "is_configured": true, 00:14:37.009 "data_offset": 2048, 00:14:37.009 "data_size": 63488 00:14:37.009 }, 00:14:37.009 { 00:14:37.009 "name": null, 00:14:37.009 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:37.009 "is_configured": false, 00:14:37.009 "data_offset": 2048, 00:14:37.009 "data_size": 63488 00:14:37.009 }, 00:14:37.009 { 00:14:37.009 "name": "BaseBdev3", 00:14:37.009 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:37.009 "is_configured": true, 00:14:37.009 "data_offset": 2048, 00:14:37.009 "data_size": 63488 00:14:37.009 } 00:14:37.009 ] 00:14:37.009 }' 00:14:37.009 11:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:37.009 11:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:37.577 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.577 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:37.836 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 
00:14:37.837 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:37.837 [2024-05-14 11:51:04.913763] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.096 11:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.096 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:38.096 "name": "Existed_Raid", 
00:14:38.096 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:38.096 "strip_size_kb": 0, 00:14:38.096 "state": "configuring", 00:14:38.096 "raid_level": "raid1", 00:14:38.096 "superblock": true, 00:14:38.096 "num_base_bdevs": 3, 00:14:38.096 "num_base_bdevs_discovered": 1, 00:14:38.096 "num_base_bdevs_operational": 3, 00:14:38.096 "base_bdevs_list": [ 00:14:38.096 { 00:14:38.096 "name": "BaseBdev1", 00:14:38.096 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:38.096 "is_configured": true, 00:14:38.096 "data_offset": 2048, 00:14:38.096 "data_size": 63488 00:14:38.096 }, 00:14:38.096 { 00:14:38.096 "name": null, 00:14:38.096 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:38.096 "is_configured": false, 00:14:38.096 "data_offset": 2048, 00:14:38.096 "data_size": 63488 00:14:38.096 }, 00:14:38.096 { 00:14:38.096 "name": null, 00:14:38.096 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:38.096 "is_configured": false, 00:14:38.096 "data_offset": 2048, 00:14:38.096 "data_size": 63488 00:14:38.096 } 00:14:38.096 ] 00:14:38.096 }' 00:14:38.096 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:38.096 11:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:39.084 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.084 11:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:39.084 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:14:39.085 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:39.381 [2024-05-14 11:51:06.245305] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.381 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.640 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:39.640 "name": "Existed_Raid", 00:14:39.640 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:39.640 "strip_size_kb": 0, 00:14:39.640 "state": "configuring", 00:14:39.640 "raid_level": "raid1", 00:14:39.640 "superblock": true, 00:14:39.640 "num_base_bdevs": 3, 
00:14:39.640 "num_base_bdevs_discovered": 2, 00:14:39.640 "num_base_bdevs_operational": 3, 00:14:39.640 "base_bdevs_list": [ 00:14:39.640 { 00:14:39.640 "name": "BaseBdev1", 00:14:39.640 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:39.640 "is_configured": true, 00:14:39.641 "data_offset": 2048, 00:14:39.641 "data_size": 63488 00:14:39.641 }, 00:14:39.641 { 00:14:39.641 "name": null, 00:14:39.641 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:39.641 "is_configured": false, 00:14:39.641 "data_offset": 2048, 00:14:39.641 "data_size": 63488 00:14:39.641 }, 00:14:39.641 { 00:14:39.641 "name": "BaseBdev3", 00:14:39.641 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:39.641 "is_configured": true, 00:14:39.641 "data_offset": 2048, 00:14:39.641 "data_size": 63488 00:14:39.641 } 00:14:39.641 ] 00:14:39.641 }' 00:14:39.641 11:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:39.641 11:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:40.210 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.210 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:40.470 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:14:40.470 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:40.730 [2024-05-14 11:51:07.572843] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:40.730 11:51:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.730 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.990 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:40.990 "name": "Existed_Raid", 00:14:40.990 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:40.990 "strip_size_kb": 0, 00:14:40.990 "state": "configuring", 00:14:40.990 "raid_level": "raid1", 00:14:40.990 "superblock": true, 00:14:40.990 "num_base_bdevs": 3, 00:14:40.990 "num_base_bdevs_discovered": 1, 00:14:40.990 "num_base_bdevs_operational": 3, 00:14:40.990 "base_bdevs_list": [ 00:14:40.990 { 00:14:40.990 "name": null, 00:14:40.990 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:40.990 "is_configured": false, 
00:14:40.990 "data_offset": 2048, 00:14:40.990 "data_size": 63488 00:14:40.990 }, 00:14:40.990 { 00:14:40.990 "name": null, 00:14:40.990 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:40.990 "is_configured": false, 00:14:40.990 "data_offset": 2048, 00:14:40.990 "data_size": 63488 00:14:40.990 }, 00:14:40.990 { 00:14:40.990 "name": "BaseBdev3", 00:14:40.990 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:40.990 "is_configured": true, 00:14:40.990 "data_offset": 2048, 00:14:40.990 "data_size": 63488 00:14:40.990 } 00:14:40.990 ] 00:14:40.990 }' 00:14:40.990 11:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:40.990 11:51:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:41.591 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.591 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:41.851 [2024-05-14 11:51:08.908718] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:41.851 11:51:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.851 11:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.111 11:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:42.111 "name": "Existed_Raid", 00:14:42.111 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:42.111 "strip_size_kb": 0, 00:14:42.111 "state": "configuring", 00:14:42.111 "raid_level": "raid1", 00:14:42.111 "superblock": true, 00:14:42.111 "num_base_bdevs": 3, 00:14:42.111 "num_base_bdevs_discovered": 2, 00:14:42.111 "num_base_bdevs_operational": 3, 00:14:42.111 "base_bdevs_list": [ 00:14:42.111 { 00:14:42.111 "name": null, 00:14:42.111 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:42.111 "is_configured": false, 00:14:42.111 "data_offset": 2048, 00:14:42.111 "data_size": 63488 00:14:42.111 }, 00:14:42.111 { 00:14:42.111 "name": "BaseBdev2", 00:14:42.111 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:42.111 "is_configured": true, 00:14:42.111 
"data_offset": 2048, 00:14:42.111 "data_size": 63488 00:14:42.111 }, 00:14:42.111 { 00:14:42.111 "name": "BaseBdev3", 00:14:42.111 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:42.111 "is_configured": true, 00:14:42.111 "data_offset": 2048, 00:14:42.111 "data_size": 63488 00:14:42.111 } 00:14:42.111 ] 00:14:42.111 }' 00:14:42.111 11:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:42.111 11:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:43.047 11:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.047 11:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:43.047 11:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:14:43.047 11:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:43.047 11:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.306 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u dbff1595-c413-41e5-9996-c80d1a938768 00:14:43.306 [2024-05-14 11:51:10.367972] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:43.306 [2024-05-14 11:51:10.368123] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xd4ecc0 00:14:43.306 [2024-05-14 11:51:10.368136] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:43.306 [2024-05-14 11:51:10.368306] 
bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd4f7c0 00:14:43.306 [2024-05-14 11:51:10.368446] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd4ecc0 00:14:43.306 [2024-05-14 11:51:10.368457] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd4ecc0 00:14:43.306 [2024-05-14 11:51:10.368551] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:43.306 NewBaseBdev 00:14:43.564 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:14:43.564 11:51:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:14:43.564 11:51:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:43.564 11:51:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:43.564 11:51:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:43.564 11:51:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:43.564 11:51:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:43.565 11:51:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:43.823 [ 00:14:43.823 { 00:14:43.823 "name": "NewBaseBdev", 00:14:43.823 "aliases": [ 00:14:43.823 "dbff1595-c413-41e5-9996-c80d1a938768" 00:14:43.823 ], 00:14:43.823 "product_name": "Malloc disk", 00:14:43.823 "block_size": 512, 00:14:43.823 "num_blocks": 65536, 00:14:43.823 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:43.823 "assigned_rate_limits": { 
00:14:43.823 "rw_ios_per_sec": 0, 00:14:43.823 "rw_mbytes_per_sec": 0, 00:14:43.823 "r_mbytes_per_sec": 0, 00:14:43.823 "w_mbytes_per_sec": 0 00:14:43.823 }, 00:14:43.823 "claimed": true, 00:14:43.823 "claim_type": "exclusive_write", 00:14:43.823 "zoned": false, 00:14:43.823 "supported_io_types": { 00:14:43.823 "read": true, 00:14:43.823 "write": true, 00:14:43.823 "unmap": true, 00:14:43.823 "write_zeroes": true, 00:14:43.823 "flush": true, 00:14:43.823 "reset": true, 00:14:43.823 "compare": false, 00:14:43.823 "compare_and_write": false, 00:14:43.823 "abort": true, 00:14:43.823 "nvme_admin": false, 00:14:43.823 "nvme_io": false 00:14:43.823 }, 00:14:43.823 "memory_domains": [ 00:14:43.823 { 00:14:43.823 "dma_device_id": "system", 00:14:43.823 "dma_device_type": 1 00:14:43.823 }, 00:14:43.824 { 00:14:43.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.824 "dma_device_type": 2 00:14:43.824 } 00:14:43.824 ], 00:14:43.824 "driver_specific": {} 00:14:43.824 } 00:14:43.824 ] 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:43.824 11:51:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.824 11:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.082 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:44.082 "name": "Existed_Raid", 00:14:44.082 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:44.082 "strip_size_kb": 0, 00:14:44.082 "state": "online", 00:14:44.082 "raid_level": "raid1", 00:14:44.082 "superblock": true, 00:14:44.082 "num_base_bdevs": 3, 00:14:44.082 "num_base_bdevs_discovered": 3, 00:14:44.082 "num_base_bdevs_operational": 3, 00:14:44.082 "base_bdevs_list": [ 00:14:44.082 { 00:14:44.082 "name": "NewBaseBdev", 00:14:44.082 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:44.082 "is_configured": true, 00:14:44.082 "data_offset": 2048, 00:14:44.082 "data_size": 63488 00:14:44.082 }, 00:14:44.082 { 00:14:44.082 "name": "BaseBdev2", 00:14:44.082 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:44.082 "is_configured": true, 00:14:44.082 "data_offset": 2048, 00:14:44.082 "data_size": 63488 00:14:44.082 }, 00:14:44.082 { 00:14:44.082 "name": "BaseBdev3", 00:14:44.082 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:44.082 "is_configured": true, 00:14:44.082 "data_offset": 2048, 00:14:44.082 "data_size": 63488 00:14:44.082 } 00:14:44.082 ] 00:14:44.082 }' 00:14:44.082 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:44.082 11:51:11 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:44.651 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:14:44.651 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:44.651 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:44.651 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:44.651 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:44.651 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:14:44.651 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:44.651 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:44.910 [2024-05-14 11:51:11.940420] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:44.910 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:44.910 "name": "Existed_Raid", 00:14:44.910 "aliases": [ 00:14:44.910 "d2f951f4-6ceb-4a82-a900-32ead68f93b6" 00:14:44.910 ], 00:14:44.910 "product_name": "Raid Volume", 00:14:44.910 "block_size": 512, 00:14:44.910 "num_blocks": 63488, 00:14:44.910 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:44.910 "assigned_rate_limits": { 00:14:44.910 "rw_ios_per_sec": 0, 00:14:44.910 "rw_mbytes_per_sec": 0, 00:14:44.910 "r_mbytes_per_sec": 0, 00:14:44.910 "w_mbytes_per_sec": 0 00:14:44.910 }, 00:14:44.910 "claimed": false, 00:14:44.910 "zoned": false, 00:14:44.910 "supported_io_types": { 00:14:44.910 "read": true, 00:14:44.910 "write": true, 00:14:44.910 "unmap": false, 
00:14:44.910 "write_zeroes": true, 00:14:44.910 "flush": false, 00:14:44.910 "reset": true, 00:14:44.910 "compare": false, 00:14:44.910 "compare_and_write": false, 00:14:44.910 "abort": false, 00:14:44.910 "nvme_admin": false, 00:14:44.910 "nvme_io": false 00:14:44.910 }, 00:14:44.910 "memory_domains": [ 00:14:44.910 { 00:14:44.910 "dma_device_id": "system", 00:14:44.910 "dma_device_type": 1 00:14:44.910 }, 00:14:44.910 { 00:14:44.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.910 "dma_device_type": 2 00:14:44.910 }, 00:14:44.910 { 00:14:44.910 "dma_device_id": "system", 00:14:44.910 "dma_device_type": 1 00:14:44.910 }, 00:14:44.910 { 00:14:44.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.910 "dma_device_type": 2 00:14:44.910 }, 00:14:44.910 { 00:14:44.910 "dma_device_id": "system", 00:14:44.910 "dma_device_type": 1 00:14:44.910 }, 00:14:44.910 { 00:14:44.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.910 "dma_device_type": 2 00:14:44.910 } 00:14:44.910 ], 00:14:44.910 "driver_specific": { 00:14:44.910 "raid": { 00:14:44.910 "uuid": "d2f951f4-6ceb-4a82-a900-32ead68f93b6", 00:14:44.910 "strip_size_kb": 0, 00:14:44.910 "state": "online", 00:14:44.910 "raid_level": "raid1", 00:14:44.910 "superblock": true, 00:14:44.910 "num_base_bdevs": 3, 00:14:44.910 "num_base_bdevs_discovered": 3, 00:14:44.910 "num_base_bdevs_operational": 3, 00:14:44.910 "base_bdevs_list": [ 00:14:44.910 { 00:14:44.910 "name": "NewBaseBdev", 00:14:44.910 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:44.910 "is_configured": true, 00:14:44.910 "data_offset": 2048, 00:14:44.910 "data_size": 63488 00:14:44.910 }, 00:14:44.910 { 00:14:44.910 "name": "BaseBdev2", 00:14:44.910 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:44.910 "is_configured": true, 00:14:44.910 "data_offset": 2048, 00:14:44.910 "data_size": 63488 00:14:44.910 }, 00:14:44.910 { 00:14:44.910 "name": "BaseBdev3", 00:14:44.910 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:44.910 
"is_configured": true, 00:14:44.910 "data_offset": 2048, 00:14:44.910 "data_size": 63488 00:14:44.910 } 00:14:44.910 ] 00:14:44.910 } 00:14:44.910 } 00:14:44.910 }' 00:14:44.910 11:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:45.169 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:14:45.169 BaseBdev2 00:14:45.169 BaseBdev3' 00:14:45.169 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:45.169 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:45.169 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:45.169 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:45.169 "name": "NewBaseBdev", 00:14:45.169 "aliases": [ 00:14:45.169 "dbff1595-c413-41e5-9996-c80d1a938768" 00:14:45.169 ], 00:14:45.169 "product_name": "Malloc disk", 00:14:45.169 "block_size": 512, 00:14:45.169 "num_blocks": 65536, 00:14:45.169 "uuid": "dbff1595-c413-41e5-9996-c80d1a938768", 00:14:45.169 "assigned_rate_limits": { 00:14:45.169 "rw_ios_per_sec": 0, 00:14:45.169 "rw_mbytes_per_sec": 0, 00:14:45.169 "r_mbytes_per_sec": 0, 00:14:45.169 "w_mbytes_per_sec": 0 00:14:45.169 }, 00:14:45.169 "claimed": true, 00:14:45.169 "claim_type": "exclusive_write", 00:14:45.169 "zoned": false, 00:14:45.169 "supported_io_types": { 00:14:45.169 "read": true, 00:14:45.169 "write": true, 00:14:45.169 "unmap": true, 00:14:45.169 "write_zeroes": true, 00:14:45.169 "flush": true, 00:14:45.169 "reset": true, 00:14:45.169 "compare": false, 00:14:45.169 "compare_and_write": false, 00:14:45.169 "abort": true, 00:14:45.169 "nvme_admin": false, 
00:14:45.169 "nvme_io": false 00:14:45.169 }, 00:14:45.169 "memory_domains": [ 00:14:45.169 { 00:14:45.169 "dma_device_id": "system", 00:14:45.169 "dma_device_type": 1 00:14:45.169 }, 00:14:45.169 { 00:14:45.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.169 "dma_device_type": 2 00:14:45.169 } 00:14:45.169 ], 00:14:45.169 "driver_specific": {} 00:14:45.169 }' 00:14:45.169 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:45.169 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:45.432 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:45.432 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:45.432 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:45.432 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.432 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:45.432 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:45.432 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.432 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:45.432 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:45.691 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:45.691 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:45.691 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:45.691 11:51:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:45.691 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:45.691 "name": "BaseBdev2", 00:14:45.691 "aliases": [ 00:14:45.691 "06202232-28a7-4497-9f10-e93fc0c589a4" 00:14:45.691 ], 00:14:45.691 "product_name": "Malloc disk", 00:14:45.691 "block_size": 512, 00:14:45.691 "num_blocks": 65536, 00:14:45.691 "uuid": "06202232-28a7-4497-9f10-e93fc0c589a4", 00:14:45.691 "assigned_rate_limits": { 00:14:45.691 "rw_ios_per_sec": 0, 00:14:45.691 "rw_mbytes_per_sec": 0, 00:14:45.691 "r_mbytes_per_sec": 0, 00:14:45.691 "w_mbytes_per_sec": 0 00:14:45.691 }, 00:14:45.691 "claimed": true, 00:14:45.691 "claim_type": "exclusive_write", 00:14:45.691 "zoned": false, 00:14:45.691 "supported_io_types": { 00:14:45.691 "read": true, 00:14:45.691 "write": true, 00:14:45.691 "unmap": true, 00:14:45.691 "write_zeroes": true, 00:14:45.691 "flush": true, 00:14:45.691 "reset": true, 00:14:45.691 "compare": false, 00:14:45.691 "compare_and_write": false, 00:14:45.691 "abort": true, 00:14:45.691 "nvme_admin": false, 00:14:45.691 "nvme_io": false 00:14:45.691 }, 00:14:45.691 "memory_domains": [ 00:14:45.691 { 00:14:45.691 "dma_device_id": "system", 00:14:45.691 "dma_device_type": 1 00:14:45.691 }, 00:14:45.691 { 00:14:45.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.691 "dma_device_type": 2 00:14:45.691 } 00:14:45.691 ], 00:14:45.691 "driver_specific": {} 00:14:45.691 }' 00:14:45.691 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:45.950 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:45.950 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:45.950 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:45.950 11:51:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:45.950 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.950 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:45.950 11:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:45.950 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.950 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:46.209 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:46.209 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:46.209 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:46.209 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:46.209 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:46.469 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:46.469 "name": "BaseBdev3", 00:14:46.469 "aliases": [ 00:14:46.469 "e4899f82-c194-4298-8f9c-afda195e4c9e" 00:14:46.469 ], 00:14:46.469 "product_name": "Malloc disk", 00:14:46.469 "block_size": 512, 00:14:46.469 "num_blocks": 65536, 00:14:46.469 "uuid": "e4899f82-c194-4298-8f9c-afda195e4c9e", 00:14:46.469 "assigned_rate_limits": { 00:14:46.469 "rw_ios_per_sec": 0, 00:14:46.469 "rw_mbytes_per_sec": 0, 00:14:46.469 "r_mbytes_per_sec": 0, 00:14:46.469 "w_mbytes_per_sec": 0 00:14:46.469 }, 00:14:46.469 "claimed": true, 00:14:46.469 "claim_type": "exclusive_write", 00:14:46.469 "zoned": false, 00:14:46.469 "supported_io_types": { 00:14:46.469 "read": true, 00:14:46.469 
"write": true, 00:14:46.469 "unmap": true, 00:14:46.469 "write_zeroes": true, 00:14:46.469 "flush": true, 00:14:46.469 "reset": true, 00:14:46.469 "compare": false, 00:14:46.469 "compare_and_write": false, 00:14:46.469 "abort": true, 00:14:46.469 "nvme_admin": false, 00:14:46.469 "nvme_io": false 00:14:46.469 }, 00:14:46.469 "memory_domains": [ 00:14:46.469 { 00:14:46.469 "dma_device_id": "system", 00:14:46.469 "dma_device_type": 1 00:14:46.469 }, 00:14:46.469 { 00:14:46.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.469 "dma_device_type": 2 00:14:46.469 } 00:14:46.469 ], 00:14:46.469 "driver_specific": {} 00:14:46.469 }' 00:14:46.469 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:46.469 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:46.469 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:46.469 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:46.469 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:46.469 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:46.469 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:46.469 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:46.729 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:46.729 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:46.729 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:46.729 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:46.729 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:46.989 [2024-05-14 11:51:13.917429] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:46.989 [2024-05-14 11:51:13.917454] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:46.989 [2024-05-14 11:51:13.917510] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:46.989 [2024-05-14 11:51:13.917777] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:46.989 [2024-05-14 11:51:13.917791] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd4ecc0 name Existed_Raid, state offline 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 1704030 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1704030 ']' 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 1704030 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1704030 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1704030' 00:14:46.989 killing process with pid 1704030 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@965 -- # kill 1704030 00:14:46.989 [2024-05-14 11:51:13.991355] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:46.989 11:51:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 1704030 00:14:46.989 [2024-05-14 11:51:14.018865] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:47.249 11:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:14:47.249 00:14:47.249 real 0m28.103s 00:14:47.249 user 0m51.594s 00:14:47.249 sys 0m4.984s 00:14:47.249 11:51:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:14:47.249 11:51:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:47.249 ************************************ 00:14:47.249 END TEST raid_state_function_test_sb 00:14:47.249 ************************************ 00:14:47.249 11:51:14 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:14:47.249 11:51:14 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:14:47.249 11:51:14 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:14:47.249 11:51:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:47.249 ************************************ 00:14:47.249 START TEST raid_superblock_test 00:14:47.249 ************************************ 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 3 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=3 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 
00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=1708833 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 1708833 /var/tmp/spdk-raid.sock 00:14:47.508 11:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:47.509 11:51:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 1708833 ']' 00:14:47.509 11:51:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:47.509 11:51:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:14:47.509 11:51:14 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:47.509 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:47.509 11:51:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:14:47.509 11:51:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.509 [2024-05-14 11:51:14.397072] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:14:47.509 [2024-05-14 11:51:14.397141] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1708833 ] 00:14:47.509 [2024-05-14 11:51:14.517405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:47.768 [2024-05-14 11:51:14.623341] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:47.768 [2024-05-14 11:51:14.695229] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:47.768 [2024-05-14 11:51:14.695270] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:48.335 11:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:48.336 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:48.595 malloc1 00:14:48.595 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:48.854 [2024-05-14 11:51:15.737898] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:48.854 [2024-05-14 11:51:15.737946] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:48.854 [2024-05-14 11:51:15.737971] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21db2a0 00:14:48.854 [2024-05-14 11:51:15.737984] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:48.854 [2024-05-14 11:51:15.739548] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:48.854 [2024-05-14 11:51:15.739575] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:48.854 pt1 00:14:48.854 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:48.854 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:48.854 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:14:48.854 11:51:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:14:48.854 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:48.854 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:48.854 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:48.854 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:48.854 11:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:49.113 malloc2 00:14:49.113 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:49.372 [2024-05-14 11:51:16.235967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:49.372 [2024-05-14 11:51:16.236007] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:49.372 [2024-05-14 11:51:16.236027] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238e480 00:14:49.372 [2024-05-14 11:51:16.236039] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:49.372 [2024-05-14 11:51:16.237428] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:49.372 [2024-05-14 11:51:16.237455] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:49.372 pt2 00:14:49.372 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:49.372 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:49.372 11:51:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:14:49.372 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:14:49.372 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:49.372 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:49.372 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:49.372 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:49.372 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:49.372 malloc3 00:14:49.372 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:49.631 [2024-05-14 11:51:16.658966] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:49.631 [2024-05-14 11:51:16.659017] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:49.631 [2024-05-14 11:51:16.659035] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d4e80 00:14:49.631 [2024-05-14 11:51:16.659048] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:49.631 [2024-05-14 11:51:16.660516] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:49.631 [2024-05-14 11:51:16.660549] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:49.631 pt3 00:14:49.631 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:49.631 
11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:49.631 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:49.891 [2024-05-14 11:51:16.907642] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:49.891 [2024-05-14 11:51:16.908943] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:49.891 [2024-05-14 11:51:16.908997] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:49.891 [2024-05-14 11:51:16.909151] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x21d6dc0 00:14:49.891 [2024-05-14 11:51:16.909163] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:49.891 [2024-05-14 11:51:16.909363] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21dc480 00:14:49.891 [2024-05-14 11:51:16.909535] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21d6dc0 00:14:49.891 [2024-05-14 11:51:16.909546] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21d6dc0 00:14:49.891 [2024-05-14 11:51:16.909643] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=0 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.891 11:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:50.150 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:50.150 "name": "raid_bdev1", 00:14:50.150 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:14:50.150 "strip_size_kb": 0, 00:14:50.150 "state": "online", 00:14:50.150 "raid_level": "raid1", 00:14:50.150 "superblock": true, 00:14:50.150 "num_base_bdevs": 3, 00:14:50.150 "num_base_bdevs_discovered": 3, 00:14:50.150 "num_base_bdevs_operational": 3, 00:14:50.150 "base_bdevs_list": [ 00:14:50.150 { 00:14:50.150 "name": "pt1", 00:14:50.150 "uuid": "7a138f98-1976-5afe-be81-6fda481a3e93", 00:14:50.150 "is_configured": true, 00:14:50.150 "data_offset": 2048, 00:14:50.150 "data_size": 63488 00:14:50.150 }, 00:14:50.150 { 00:14:50.150 "name": "pt2", 00:14:50.150 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:14:50.150 "is_configured": true, 00:14:50.150 "data_offset": 2048, 00:14:50.150 "data_size": 63488 00:14:50.150 }, 00:14:50.150 { 00:14:50.150 "name": "pt3", 00:14:50.150 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:14:50.150 "is_configured": true, 00:14:50.150 "data_offset": 2048, 00:14:50.150 
"data_size": 63488 00:14:50.150 } 00:14:50.150 ] 00:14:50.150 }' 00:14:50.150 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:50.150 11:51:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.719 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:14:50.719 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:14:50.719 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:50.719 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:50.719 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:50.719 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:50.719 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:50.719 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:50.978 [2024-05-14 11:51:17.818282] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:50.978 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:50.978 "name": "raid_bdev1", 00:14:50.978 "aliases": [ 00:14:50.978 "1709ffb9-5a15-4d8f-a496-dfadb01e8816" 00:14:50.978 ], 00:14:50.978 "product_name": "Raid Volume", 00:14:50.978 "block_size": 512, 00:14:50.978 "num_blocks": 63488, 00:14:50.978 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:14:50.978 "assigned_rate_limits": { 00:14:50.978 "rw_ios_per_sec": 0, 00:14:50.978 "rw_mbytes_per_sec": 0, 00:14:50.978 "r_mbytes_per_sec": 0, 00:14:50.978 "w_mbytes_per_sec": 0 00:14:50.978 }, 00:14:50.978 "claimed": false, 00:14:50.978 "zoned": false, 00:14:50.978 
"supported_io_types": { 00:14:50.978 "read": true, 00:14:50.978 "write": true, 00:14:50.978 "unmap": false, 00:14:50.978 "write_zeroes": true, 00:14:50.978 "flush": false, 00:14:50.978 "reset": true, 00:14:50.978 "compare": false, 00:14:50.978 "compare_and_write": false, 00:14:50.978 "abort": false, 00:14:50.978 "nvme_admin": false, 00:14:50.978 "nvme_io": false 00:14:50.978 }, 00:14:50.978 "memory_domains": [ 00:14:50.978 { 00:14:50.978 "dma_device_id": "system", 00:14:50.978 "dma_device_type": 1 00:14:50.978 }, 00:14:50.978 { 00:14:50.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.978 "dma_device_type": 2 00:14:50.978 }, 00:14:50.978 { 00:14:50.978 "dma_device_id": "system", 00:14:50.978 "dma_device_type": 1 00:14:50.978 }, 00:14:50.978 { 00:14:50.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.978 "dma_device_type": 2 00:14:50.978 }, 00:14:50.978 { 00:14:50.978 "dma_device_id": "system", 00:14:50.978 "dma_device_type": 1 00:14:50.978 }, 00:14:50.978 { 00:14:50.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.978 "dma_device_type": 2 00:14:50.978 } 00:14:50.978 ], 00:14:50.978 "driver_specific": { 00:14:50.978 "raid": { 00:14:50.978 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:14:50.978 "strip_size_kb": 0, 00:14:50.978 "state": "online", 00:14:50.978 "raid_level": "raid1", 00:14:50.978 "superblock": true, 00:14:50.978 "num_base_bdevs": 3, 00:14:50.978 "num_base_bdevs_discovered": 3, 00:14:50.978 "num_base_bdevs_operational": 3, 00:14:50.978 "base_bdevs_list": [ 00:14:50.978 { 00:14:50.978 "name": "pt1", 00:14:50.978 "uuid": "7a138f98-1976-5afe-be81-6fda481a3e93", 00:14:50.978 "is_configured": true, 00:14:50.978 "data_offset": 2048, 00:14:50.978 "data_size": 63488 00:14:50.978 }, 00:14:50.978 { 00:14:50.978 "name": "pt2", 00:14:50.978 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:14:50.978 "is_configured": true, 00:14:50.978 "data_offset": 2048, 00:14:50.978 "data_size": 63488 00:14:50.978 }, 00:14:50.978 { 00:14:50.978 "name": 
"pt3", 00:14:50.978 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:14:50.978 "is_configured": true, 00:14:50.978 "data_offset": 2048, 00:14:50.978 "data_size": 63488 00:14:50.978 } 00:14:50.978 ] 00:14:50.978 } 00:14:50.978 } 00:14:50.978 }' 00:14:50.978 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:50.978 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:14:50.978 pt2 00:14:50.978 pt3' 00:14:50.979 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:50.979 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:50.979 11:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:51.238 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:51.238 "name": "pt1", 00:14:51.238 "aliases": [ 00:14:51.238 "7a138f98-1976-5afe-be81-6fda481a3e93" 00:14:51.238 ], 00:14:51.238 "product_name": "passthru", 00:14:51.238 "block_size": 512, 00:14:51.238 "num_blocks": 65536, 00:14:51.238 "uuid": "7a138f98-1976-5afe-be81-6fda481a3e93", 00:14:51.238 "assigned_rate_limits": { 00:14:51.238 "rw_ios_per_sec": 0, 00:14:51.238 "rw_mbytes_per_sec": 0, 00:14:51.238 "r_mbytes_per_sec": 0, 00:14:51.238 "w_mbytes_per_sec": 0 00:14:51.238 }, 00:14:51.238 "claimed": true, 00:14:51.238 "claim_type": "exclusive_write", 00:14:51.238 "zoned": false, 00:14:51.238 "supported_io_types": { 00:14:51.238 "read": true, 00:14:51.238 "write": true, 00:14:51.238 "unmap": true, 00:14:51.238 "write_zeroes": true, 00:14:51.238 "flush": true, 00:14:51.238 "reset": true, 00:14:51.238 "compare": false, 00:14:51.238 "compare_and_write": false, 00:14:51.238 "abort": true, 00:14:51.238 "nvme_admin": false, 
00:14:51.238 "nvme_io": false 00:14:51.238 }, 00:14:51.238 "memory_domains": [ 00:14:51.238 { 00:14:51.238 "dma_device_id": "system", 00:14:51.238 "dma_device_type": 1 00:14:51.238 }, 00:14:51.238 { 00:14:51.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.238 "dma_device_type": 2 00:14:51.238 } 00:14:51.238 ], 00:14:51.238 "driver_specific": { 00:14:51.238 "passthru": { 00:14:51.238 "name": "pt1", 00:14:51.238 "base_bdev_name": "malloc1" 00:14:51.238 } 00:14:51.238 } 00:14:51.238 }' 00:14:51.238 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:51.238 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:51.238 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:51.238 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:51.238 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:51.238 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:51.238 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:51.497 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:51.497 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:51.497 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:51.497 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:51.497 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:51.497 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:51.498 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 
00:14:51.498 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:51.757 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:51.757 "name": "pt2", 00:14:51.757 "aliases": [ 00:14:51.757 "81eda2c3-8b13-53bc-b2da-bb908b6804ca" 00:14:51.757 ], 00:14:51.757 "product_name": "passthru", 00:14:51.757 "block_size": 512, 00:14:51.757 "num_blocks": 65536, 00:14:51.757 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:14:51.757 "assigned_rate_limits": { 00:14:51.757 "rw_ios_per_sec": 0, 00:14:51.757 "rw_mbytes_per_sec": 0, 00:14:51.757 "r_mbytes_per_sec": 0, 00:14:51.757 "w_mbytes_per_sec": 0 00:14:51.757 }, 00:14:51.757 "claimed": true, 00:14:51.757 "claim_type": "exclusive_write", 00:14:51.757 "zoned": false, 00:14:51.757 "supported_io_types": { 00:14:51.757 "read": true, 00:14:51.757 "write": true, 00:14:51.757 "unmap": true, 00:14:51.757 "write_zeroes": true, 00:14:51.757 "flush": true, 00:14:51.757 "reset": true, 00:14:51.757 "compare": false, 00:14:51.757 "compare_and_write": false, 00:14:51.757 "abort": true, 00:14:51.757 "nvme_admin": false, 00:14:51.757 "nvme_io": false 00:14:51.757 }, 00:14:51.757 "memory_domains": [ 00:14:51.757 { 00:14:51.757 "dma_device_id": "system", 00:14:51.757 "dma_device_type": 1 00:14:51.757 }, 00:14:51.757 { 00:14:51.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.757 "dma_device_type": 2 00:14:51.757 } 00:14:51.757 ], 00:14:51.757 "driver_specific": { 00:14:51.757 "passthru": { 00:14:51.757 "name": "pt2", 00:14:51.757 "base_bdev_name": "malloc2" 00:14:51.757 } 00:14:51.757 } 00:14:51.757 }' 00:14:51.757 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:51.757 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:51.757 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:51.757 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_size 00:14:51.757 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:52.015 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:52.015 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:52.015 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:52.015 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:52.015 11:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:52.015 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:52.015 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:52.015 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:52.015 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:52.015 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:52.274 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:52.274 "name": "pt3", 00:14:52.274 "aliases": [ 00:14:52.274 "838c032d-6616-5a1e-8321-cc66aece320e" 00:14:52.274 ], 00:14:52.274 "product_name": "passthru", 00:14:52.274 "block_size": 512, 00:14:52.274 "num_blocks": 65536, 00:14:52.274 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:14:52.274 "assigned_rate_limits": { 00:14:52.274 "rw_ios_per_sec": 0, 00:14:52.274 "rw_mbytes_per_sec": 0, 00:14:52.274 "r_mbytes_per_sec": 0, 00:14:52.274 "w_mbytes_per_sec": 0 00:14:52.274 }, 00:14:52.274 "claimed": true, 00:14:52.274 "claim_type": "exclusive_write", 00:14:52.274 "zoned": false, 00:14:52.274 "supported_io_types": { 00:14:52.274 "read": true, 00:14:52.274 "write": true, 
00:14:52.274 "unmap": true, 00:14:52.274 "write_zeroes": true, 00:14:52.274 "flush": true, 00:14:52.274 "reset": true, 00:14:52.274 "compare": false, 00:14:52.274 "compare_and_write": false, 00:14:52.274 "abort": true, 00:14:52.274 "nvme_admin": false, 00:14:52.274 "nvme_io": false 00:14:52.274 }, 00:14:52.274 "memory_domains": [ 00:14:52.274 { 00:14:52.274 "dma_device_id": "system", 00:14:52.274 "dma_device_type": 1 00:14:52.274 }, 00:14:52.274 { 00:14:52.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.274 "dma_device_type": 2 00:14:52.274 } 00:14:52.274 ], 00:14:52.274 "driver_specific": { 00:14:52.274 "passthru": { 00:14:52.274 "name": "pt3", 00:14:52.274 "base_bdev_name": "malloc3" 00:14:52.274 } 00:14:52.274 } 00:14:52.274 }' 00:14:52.275 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:52.275 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:52.533 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:52.533 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:52.533 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:52.533 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:52.533 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:52.533 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:52.533 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:52.533 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:52.533 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:52.791 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:52.791 11:51:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:52.791 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:14:52.791 [2024-05-14 11:51:19.855663] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:52.791 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=1709ffb9-5a15-4d8f-a496-dfadb01e8816 00:14:52.791 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 1709ffb9-5a15-4d8f-a496-dfadb01e8816 ']' 00:14:52.791 11:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:53.053 [2024-05-14 11:51:20.096063] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:53.053 [2024-05-14 11:51:20.096092] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:53.053 [2024-05-14 11:51:20.096146] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:53.053 [2024-05-14 11:51:20.096220] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:53.053 [2024-05-14 11:51:20.096233] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d6dc0 name raid_bdev1, state offline 00:14:53.053 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:14:53.053 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.361 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:14:53.361 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 
00:14:53.361 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:14:53.361 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:53.619 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:14:53.619 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:53.877 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:14:53.877 11:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:54.134 11:51:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:54.134 11:51:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:54.392 
11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:54.392 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:54.651 [2024-05-14 11:51:21.555862] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:54.651 [2024-05-14 11:51:21.557214] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:54.651 [2024-05-14 11:51:21.557260] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:54.651 [2024-05-14 11:51:21.557306] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:54.651 [2024-05-14 11:51:21.557346] 
bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:54.651 [2024-05-14 11:51:21.557368] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:54.651 [2024-05-14 11:51:21.557391] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:54.651 [2024-05-14 11:51:21.557409] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d2420 name raid_bdev1, state configuring 00:14:54.651 request: 00:14:54.651 { 00:14:54.651 "name": "raid_bdev1", 00:14:54.651 "raid_level": "raid1", 00:14:54.651 "base_bdevs": [ 00:14:54.651 "malloc1", 00:14:54.651 "malloc2", 00:14:54.651 "malloc3" 00:14:54.651 ], 00:14:54.651 "superblock": false, 00:14:54.651 "method": "bdev_raid_create", 00:14:54.651 "req_id": 1 00:14:54.651 } 00:14:54.651 Got JSON-RPC error response 00:14:54.651 response: 00:14:54.651 { 00:14:54.651 "code": -17, 00:14:54.651 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:54.651 } 00:14:54.651 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:54.651 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:54.651 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:54.651 11:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:54.651 11:51:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.651 11:51:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:14:54.909 11:51:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:14:54.909 11:51:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 
00:14:54.909 11:51:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:55.167 [2024-05-14 11:51:22.033072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:55.167 [2024-05-14 11:51:22.033118] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:55.167 [2024-05-14 11:51:22.033141] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2384060 00:14:55.167 [2024-05-14 11:51:22.033153] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:55.167 [2024-05-14 11:51:22.034821] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:55.167 [2024-05-14 11:51:22.034850] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:55.167 [2024-05-14 11:51:22.034919] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:14:55.167 [2024-05-14 11:51:22.034947] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:55.167 pt1 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.167 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:55.426 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:55.426 "name": "raid_bdev1", 00:14:55.426 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:14:55.426 "strip_size_kb": 0, 00:14:55.426 "state": "configuring", 00:14:55.426 "raid_level": "raid1", 00:14:55.426 "superblock": true, 00:14:55.426 "num_base_bdevs": 3, 00:14:55.426 "num_base_bdevs_discovered": 1, 00:14:55.426 "num_base_bdevs_operational": 3, 00:14:55.426 "base_bdevs_list": [ 00:14:55.426 { 00:14:55.426 "name": "pt1", 00:14:55.426 "uuid": "7a138f98-1976-5afe-be81-6fda481a3e93", 00:14:55.426 "is_configured": true, 00:14:55.426 "data_offset": 2048, 00:14:55.426 "data_size": 63488 00:14:55.426 }, 00:14:55.426 { 00:14:55.426 "name": null, 00:14:55.426 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:14:55.426 "is_configured": false, 00:14:55.426 "data_offset": 2048, 00:14:55.426 "data_size": 63488 00:14:55.426 }, 00:14:55.426 { 00:14:55.426 "name": null, 00:14:55.426 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:14:55.426 "is_configured": false, 00:14:55.426 "data_offset": 2048, 00:14:55.426 "data_size": 63488 00:14:55.426 } 00:14:55.426 ] 00:14:55.426 }' 00:14:55.426 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:55.426 11:51:22 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.991 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 3 -gt 2 ']' 00:14:55.991 11:51:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:56.250 [2024-05-14 11:51:23.103932] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:56.250 [2024-05-14 11:51:23.103987] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:56.250 [2024-05-14 11:51:23.104007] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d3530 00:14:56.250 [2024-05-14 11:51:23.104019] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:56.250 [2024-05-14 11:51:23.104376] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:56.250 [2024-05-14 11:51:23.104393] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:56.250 [2024-05-14 11:51:23.104469] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:14:56.250 [2024-05-14 11:51:23.104489] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:56.250 pt2 00:14:56.250 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:56.509 [2024-05-14 11:51:23.336569] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:14:56.509 11:51:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.509 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:56.767 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:56.767 "name": "raid_bdev1", 00:14:56.767 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:14:56.767 "strip_size_kb": 0, 00:14:56.767 "state": "configuring", 00:14:56.767 "raid_level": "raid1", 00:14:56.767 "superblock": true, 00:14:56.767 "num_base_bdevs": 3, 00:14:56.767 "num_base_bdevs_discovered": 1, 00:14:56.767 "num_base_bdevs_operational": 3, 00:14:56.767 "base_bdevs_list": [ 00:14:56.767 { 00:14:56.767 "name": "pt1", 00:14:56.767 "uuid": "7a138f98-1976-5afe-be81-6fda481a3e93", 00:14:56.767 "is_configured": true, 00:14:56.767 "data_offset": 2048, 00:14:56.767 "data_size": 63488 00:14:56.767 }, 00:14:56.767 { 00:14:56.767 "name": null, 00:14:56.767 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:14:56.767 
"is_configured": false, 00:14:56.767 "data_offset": 2048, 00:14:56.767 "data_size": 63488 00:14:56.767 }, 00:14:56.767 { 00:14:56.767 "name": null, 00:14:56.767 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:14:56.767 "is_configured": false, 00:14:56.767 "data_offset": 2048, 00:14:56.767 "data_size": 63488 00:14:56.767 } 00:14:56.767 ] 00:14:56.767 }' 00:14:56.767 11:51:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:56.767 11:51:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.334 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:14:57.334 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:14:57.334 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:57.334 [2024-05-14 11:51:24.395360] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:57.334 [2024-05-14 11:51:24.395420] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:57.334 [2024-05-14 11:51:24.395439] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d3760 00:14:57.334 [2024-05-14 11:51:24.395451] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:57.334 [2024-05-14 11:51:24.395805] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:57.334 [2024-05-14 11:51:24.395823] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:57.334 [2024-05-14 11:51:24.395888] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:14:57.334 [2024-05-14 11:51:24.395907] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:57.334 pt2 
00:14:57.334 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:14:57.334 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:14:57.334 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:57.591 [2024-05-14 11:51:24.635997] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:57.591 [2024-05-14 11:51:24.636039] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:57.591 [2024-05-14 11:51:24.636057] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21dc5c0 00:14:57.591 [2024-05-14 11:51:24.636069] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:57.591 [2024-05-14 11:51:24.636407] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:57.591 [2024-05-14 11:51:24.636424] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:57.591 [2024-05-14 11:51:24.636483] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:14:57.591 [2024-05-14 11:51:24.636501] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:57.591 [2024-05-14 11:51:24.636618] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x21d7490 00:14:57.591 [2024-05-14 11:51:24.636629] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:57.591 [2024-05-14 11:51:24.636799] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d9e40 00:14:57.591 [2024-05-14 11:51:24.636933] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21d7490 00:14:57.591 [2024-05-14 11:51:24.636943] bdev_raid.c:1726:raid_bdev_configure_cont: 
*DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21d7490 00:14:57.591 [2024-05-14 11:51:24.637040] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:57.591 pt3 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:57.591 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:57.592 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:57.592 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:57.592 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.592 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:57.850 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:57.850 "name": "raid_bdev1", 00:14:57.850 "uuid": 
"1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:14:57.850 "strip_size_kb": 0, 00:14:57.850 "state": "online", 00:14:57.850 "raid_level": "raid1", 00:14:57.850 "superblock": true, 00:14:57.850 "num_base_bdevs": 3, 00:14:57.850 "num_base_bdevs_discovered": 3, 00:14:57.850 "num_base_bdevs_operational": 3, 00:14:57.850 "base_bdevs_list": [ 00:14:57.850 { 00:14:57.850 "name": "pt1", 00:14:57.850 "uuid": "7a138f98-1976-5afe-be81-6fda481a3e93", 00:14:57.850 "is_configured": true, 00:14:57.850 "data_offset": 2048, 00:14:57.850 "data_size": 63488 00:14:57.850 }, 00:14:57.850 { 00:14:57.850 "name": "pt2", 00:14:57.850 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:14:57.850 "is_configured": true, 00:14:57.850 "data_offset": 2048, 00:14:57.850 "data_size": 63488 00:14:57.850 }, 00:14:57.850 { 00:14:57.850 "name": "pt3", 00:14:57.850 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:14:57.850 "is_configured": true, 00:14:57.850 "data_offset": 2048, 00:14:57.850 "data_size": 63488 00:14:57.850 } 00:14:57.850 ] 00:14:57.850 }' 00:14:57.850 11:51:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:57.850 11:51:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.417 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:14:58.417 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:14:58.417 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:58.417 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:58.417 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:58.417 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:58.417 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:58.417 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:58.676 [2024-05-14 11:51:25.707085] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:58.676 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:58.676 "name": "raid_bdev1", 00:14:58.676 "aliases": [ 00:14:58.676 "1709ffb9-5a15-4d8f-a496-dfadb01e8816" 00:14:58.676 ], 00:14:58.676 "product_name": "Raid Volume", 00:14:58.676 "block_size": 512, 00:14:58.676 "num_blocks": 63488, 00:14:58.676 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:14:58.676 "assigned_rate_limits": { 00:14:58.676 "rw_ios_per_sec": 0, 00:14:58.676 "rw_mbytes_per_sec": 0, 00:14:58.676 "r_mbytes_per_sec": 0, 00:14:58.676 "w_mbytes_per_sec": 0 00:14:58.676 }, 00:14:58.676 "claimed": false, 00:14:58.676 "zoned": false, 00:14:58.676 "supported_io_types": { 00:14:58.676 "read": true, 00:14:58.676 "write": true, 00:14:58.676 "unmap": false, 00:14:58.676 "write_zeroes": true, 00:14:58.676 "flush": false, 00:14:58.676 "reset": true, 00:14:58.676 "compare": false, 00:14:58.676 "compare_and_write": false, 00:14:58.676 "abort": false, 00:14:58.676 "nvme_admin": false, 00:14:58.676 "nvme_io": false 00:14:58.676 }, 00:14:58.676 "memory_domains": [ 00:14:58.676 { 00:14:58.676 "dma_device_id": "system", 00:14:58.676 "dma_device_type": 1 00:14:58.676 }, 00:14:58.676 { 00:14:58.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.676 "dma_device_type": 2 00:14:58.676 }, 00:14:58.676 { 00:14:58.676 "dma_device_id": "system", 00:14:58.676 "dma_device_type": 1 00:14:58.676 }, 00:14:58.676 { 00:14:58.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.676 "dma_device_type": 2 00:14:58.676 }, 00:14:58.676 { 00:14:58.676 "dma_device_id": "system", 00:14:58.676 "dma_device_type": 1 00:14:58.676 }, 00:14:58.676 { 00:14:58.676 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.676 "dma_device_type": 2 00:14:58.676 } 00:14:58.676 ], 00:14:58.676 "driver_specific": { 00:14:58.676 "raid": { 00:14:58.676 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:14:58.676 "strip_size_kb": 0, 00:14:58.676 "state": "online", 00:14:58.676 "raid_level": "raid1", 00:14:58.676 "superblock": true, 00:14:58.676 "num_base_bdevs": 3, 00:14:58.676 "num_base_bdevs_discovered": 3, 00:14:58.676 "num_base_bdevs_operational": 3, 00:14:58.676 "base_bdevs_list": [ 00:14:58.676 { 00:14:58.676 "name": "pt1", 00:14:58.676 "uuid": "7a138f98-1976-5afe-be81-6fda481a3e93", 00:14:58.676 "is_configured": true, 00:14:58.676 "data_offset": 2048, 00:14:58.676 "data_size": 63488 00:14:58.676 }, 00:14:58.676 { 00:14:58.676 "name": "pt2", 00:14:58.676 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:14:58.676 "is_configured": true, 00:14:58.676 "data_offset": 2048, 00:14:58.676 "data_size": 63488 00:14:58.676 }, 00:14:58.676 { 00:14:58.676 "name": "pt3", 00:14:58.676 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:14:58.676 "is_configured": true, 00:14:58.676 "data_offset": 2048, 00:14:58.676 "data_size": 63488 00:14:58.676 } 00:14:58.676 ] 00:14:58.676 } 00:14:58.676 } 00:14:58.676 }' 00:14:58.676 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:58.936 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:14:58.936 pt2 00:14:58.936 pt3' 00:14:58.936 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:58.936 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:58.936 11:51:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:58.936 11:51:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:58.936 "name": "pt1", 00:14:58.936 "aliases": [ 00:14:58.936 "7a138f98-1976-5afe-be81-6fda481a3e93" 00:14:58.936 ], 00:14:58.936 "product_name": "passthru", 00:14:58.936 "block_size": 512, 00:14:58.936 "num_blocks": 65536, 00:14:58.936 "uuid": "7a138f98-1976-5afe-be81-6fda481a3e93", 00:14:58.936 "assigned_rate_limits": { 00:14:58.936 "rw_ios_per_sec": 0, 00:14:58.936 "rw_mbytes_per_sec": 0, 00:14:58.936 "r_mbytes_per_sec": 0, 00:14:58.936 "w_mbytes_per_sec": 0 00:14:58.936 }, 00:14:58.936 "claimed": true, 00:14:58.936 "claim_type": "exclusive_write", 00:14:58.936 "zoned": false, 00:14:58.936 "supported_io_types": { 00:14:58.936 "read": true, 00:14:58.936 "write": true, 00:14:58.936 "unmap": true, 00:14:58.936 "write_zeroes": true, 00:14:58.936 "flush": true, 00:14:58.936 "reset": true, 00:14:58.936 "compare": false, 00:14:58.936 "compare_and_write": false, 00:14:58.936 "abort": true, 00:14:58.936 "nvme_admin": false, 00:14:58.936 "nvme_io": false 00:14:58.936 }, 00:14:58.936 "memory_domains": [ 00:14:58.936 { 00:14:58.936 "dma_device_id": "system", 00:14:58.936 "dma_device_type": 1 00:14:58.936 }, 00:14:58.936 { 00:14:58.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.936 "dma_device_type": 2 00:14:58.936 } 00:14:58.936 ], 00:14:58.936 "driver_specific": { 00:14:58.936 "passthru": { 00:14:58.936 "name": "pt1", 00:14:58.936 "base_bdev_name": "malloc1" 00:14:58.936 } 00:14:58.936 } 00:14:58.936 }' 00:14:58.936 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:59.195 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:59.195 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:59.195 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:59.195 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 
00:14:59.195 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.195 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:59.195 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:59.195 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.195 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:59.454 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:59.454 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:59.454 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:59.454 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:59.454 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:59.713 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:59.713 "name": "pt2", 00:14:59.713 "aliases": [ 00:14:59.713 "81eda2c3-8b13-53bc-b2da-bb908b6804ca" 00:14:59.713 ], 00:14:59.713 "product_name": "passthru", 00:14:59.713 "block_size": 512, 00:14:59.713 "num_blocks": 65536, 00:14:59.713 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:14:59.713 "assigned_rate_limits": { 00:14:59.713 "rw_ios_per_sec": 0, 00:14:59.713 "rw_mbytes_per_sec": 0, 00:14:59.713 "r_mbytes_per_sec": 0, 00:14:59.713 "w_mbytes_per_sec": 0 00:14:59.713 }, 00:14:59.713 "claimed": true, 00:14:59.713 "claim_type": "exclusive_write", 00:14:59.713 "zoned": false, 00:14:59.713 "supported_io_types": { 00:14:59.713 "read": true, 00:14:59.713 "write": true, 00:14:59.713 "unmap": true, 00:14:59.713 "write_zeroes": true, 00:14:59.713 "flush": true, 00:14:59.713 "reset": 
true, 00:14:59.713 "compare": false, 00:14:59.713 "compare_and_write": false, 00:14:59.713 "abort": true, 00:14:59.713 "nvme_admin": false, 00:14:59.713 "nvme_io": false 00:14:59.713 }, 00:14:59.713 "memory_domains": [ 00:14:59.713 { 00:14:59.713 "dma_device_id": "system", 00:14:59.713 "dma_device_type": 1 00:14:59.713 }, 00:14:59.713 { 00:14:59.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.713 "dma_device_type": 2 00:14:59.713 } 00:14:59.713 ], 00:14:59.713 "driver_specific": { 00:14:59.713 "passthru": { 00:14:59.713 "name": "pt2", 00:14:59.713 "base_bdev_name": "malloc2" 00:14:59.713 } 00:14:59.713 } 00:14:59.713 }' 00:14:59.713 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:59.713 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:59.713 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:59.713 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:59.713 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:59.713 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.713 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:59.972 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:59.972 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.972 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:59.972 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:59.972 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:59.972 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:59.972 11:51:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:59.972 11:51:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:00.231 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:00.231 "name": "pt3", 00:15:00.231 "aliases": [ 00:15:00.231 "838c032d-6616-5a1e-8321-cc66aece320e" 00:15:00.231 ], 00:15:00.231 "product_name": "passthru", 00:15:00.231 "block_size": 512, 00:15:00.231 "num_blocks": 65536, 00:15:00.231 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:15:00.231 "assigned_rate_limits": { 00:15:00.231 "rw_ios_per_sec": 0, 00:15:00.231 "rw_mbytes_per_sec": 0, 00:15:00.231 "r_mbytes_per_sec": 0, 00:15:00.231 "w_mbytes_per_sec": 0 00:15:00.231 }, 00:15:00.231 "claimed": true, 00:15:00.231 "claim_type": "exclusive_write", 00:15:00.231 "zoned": false, 00:15:00.231 "supported_io_types": { 00:15:00.231 "read": true, 00:15:00.231 "write": true, 00:15:00.231 "unmap": true, 00:15:00.231 "write_zeroes": true, 00:15:00.231 "flush": true, 00:15:00.231 "reset": true, 00:15:00.231 "compare": false, 00:15:00.231 "compare_and_write": false, 00:15:00.231 "abort": true, 00:15:00.231 "nvme_admin": false, 00:15:00.231 "nvme_io": false 00:15:00.231 }, 00:15:00.231 "memory_domains": [ 00:15:00.231 { 00:15:00.231 "dma_device_id": "system", 00:15:00.231 "dma_device_type": 1 00:15:00.231 }, 00:15:00.231 { 00:15:00.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.231 "dma_device_type": 2 00:15:00.231 } 00:15:00.231 ], 00:15:00.231 "driver_specific": { 00:15:00.231 "passthru": { 00:15:00.231 "name": "pt3", 00:15:00.231 "base_bdev_name": "malloc3" 00:15:00.231 } 00:15:00.231 } 00:15:00.231 }' 00:15:00.231 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:00.231 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:00.231 11:51:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:00.231 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:00.231 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:00.231 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:00.231 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:00.490 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:00.490 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:00.490 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:00.490 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:00.490 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:00.490 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:00.490 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:15:00.749 [2024-05-14 11:51:27.676317] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:00.749 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 1709ffb9-5a15-4d8f-a496-dfadb01e8816 '!=' 1709ffb9-5a15-4d8f-a496-dfadb01e8816 ']' 00:15:00.749 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:15:00.749 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:15:00.749 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 0 00:15:00.749 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:01.008 [2024-05-14 11:51:27.912713] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.008 11:51:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:01.267 11:51:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:01.267 "name": "raid_bdev1", 00:15:01.267 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:15:01.267 "strip_size_kb": 0, 00:15:01.267 "state": "online", 00:15:01.267 "raid_level": "raid1", 00:15:01.267 "superblock": true, 00:15:01.267 "num_base_bdevs": 3, 00:15:01.267 "num_base_bdevs_discovered": 2, 
00:15:01.267 "num_base_bdevs_operational": 2, 00:15:01.267 "base_bdevs_list": [ 00:15:01.267 { 00:15:01.267 "name": null, 00:15:01.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.267 "is_configured": false, 00:15:01.267 "data_offset": 2048, 00:15:01.267 "data_size": 63488 00:15:01.267 }, 00:15:01.267 { 00:15:01.267 "name": "pt2", 00:15:01.267 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:15:01.267 "is_configured": true, 00:15:01.267 "data_offset": 2048, 00:15:01.267 "data_size": 63488 00:15:01.267 }, 00:15:01.267 { 00:15:01.267 "name": "pt3", 00:15:01.267 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:15:01.267 "is_configured": true, 00:15:01.267 "data_offset": 2048, 00:15:01.267 "data_size": 63488 00:15:01.267 } 00:15:01.267 ] 00:15:01.267 }' 00:15:01.267 11:51:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:01.267 11:51:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.834 11:51:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:02.092 [2024-05-14 11:51:28.971526] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:02.092 [2024-05-14 11:51:28.971555] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:02.092 [2024-05-14 11:51:28.971613] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:02.092 [2024-05-14 11:51:28.971668] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:02.092 [2024-05-14 11:51:28.971680] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d7490 name raid_bdev1, state offline 00:15:02.092 11:51:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.092 11:51:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:15:02.351 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:15:02.351 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:15:02.351 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:15:02.351 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:15:02.351 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:02.610 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:15:02.610 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:15:02.610 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:02.868 [2024-05-14 11:51:29.930007] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:02.868 [2024-05-14 11:51:29.930059] vbdev_passthru.c: 636:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:15:02.868 [2024-05-14 11:51:29.930078] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d78a0 00:15:02.868 [2024-05-14 11:51:29.930091] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.868 [2024-05-14 11:51:29.931717] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.868 [2024-05-14 11:51:29.931745] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:02.868 [2024-05-14 11:51:29.931815] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:15:02.868 [2024-05-14 11:51:29.931842] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:02.868 pt2 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:02.868 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:03.127 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:03.127 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:03.127 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:03.127 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:03.127 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.127 11:51:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:03.127 11:51:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:03.127 "name": "raid_bdev1", 00:15:03.127 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:15:03.127 "strip_size_kb": 0, 00:15:03.127 "state": "configuring", 00:15:03.127 "raid_level": "raid1", 00:15:03.127 "superblock": true, 00:15:03.127 "num_base_bdevs": 3, 00:15:03.127 "num_base_bdevs_discovered": 1, 00:15:03.127 "num_base_bdevs_operational": 2, 00:15:03.127 "base_bdevs_list": [ 00:15:03.127 { 00:15:03.127 "name": null, 00:15:03.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.127 "is_configured": false, 00:15:03.127 "data_offset": 2048, 00:15:03.127 "data_size": 63488 00:15:03.127 }, 00:15:03.127 { 00:15:03.127 "name": "pt2", 00:15:03.127 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:15:03.127 "is_configured": true, 00:15:03.127 "data_offset": 2048, 00:15:03.127 "data_size": 63488 00:15:03.127 }, 00:15:03.127 { 00:15:03.127 "name": null, 00:15:03.127 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:15:03.127 "is_configured": false, 00:15:03.127 "data_offset": 2048, 00:15:03.127 "data_size": 63488 00:15:03.127 } 00:15:03.127 ] 00:15:03.127 }' 00:15:03.127 11:51:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:03.127 11:51:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.062 11:51:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i++ )) 00:15:04.062 11:51:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:15:04.062 11:51:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # i=2 00:15:04.062 11:51:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@520 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:04.062 [2024-05-14 11:51:31.024920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:04.062 [2024-05-14 11:51:31.024975] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:04.062 [2024-05-14 11:51:31.024996] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d5b90 00:15:04.062 [2024-05-14 11:51:31.025009] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:04.062 [2024-05-14 11:51:31.025371] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:04.062 [2024-05-14 11:51:31.025389] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:04.062 [2024-05-14 11:51:31.025481] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:15:04.062 [2024-05-14 11:51:31.025501] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:04.062 [2024-05-14 11:51:31.025609] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x21d2860 00:15:04.063 [2024-05-14 11:51:31.025620] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:04.063 [2024-05-14 11:51:31.025789] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d5f10 00:15:04.063 [2024-05-14 11:51:31.025915] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21d2860 00:15:04.063 [2024-05-14 11:51:31.025924] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21d2860 00:15:04.063 [2024-05-14 11:51:31.026019] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:04.063 pt3 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@523 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.063 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:04.321 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:04.321 "name": "raid_bdev1", 00:15:04.322 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:15:04.322 "strip_size_kb": 0, 00:15:04.322 "state": "online", 00:15:04.322 "raid_level": "raid1", 00:15:04.322 "superblock": true, 00:15:04.322 "num_base_bdevs": 3, 00:15:04.322 "num_base_bdevs_discovered": 2, 00:15:04.322 "num_base_bdevs_operational": 2, 00:15:04.322 "base_bdevs_list": [ 00:15:04.322 { 00:15:04.322 "name": null, 00:15:04.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.322 "is_configured": false, 00:15:04.322 
"data_offset": 2048, 00:15:04.322 "data_size": 63488 00:15:04.322 }, 00:15:04.322 { 00:15:04.322 "name": "pt2", 00:15:04.322 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:15:04.322 "is_configured": true, 00:15:04.322 "data_offset": 2048, 00:15:04.322 "data_size": 63488 00:15:04.322 }, 00:15:04.322 { 00:15:04.322 "name": "pt3", 00:15:04.322 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:15:04.322 "is_configured": true, 00:15:04.322 "data_offset": 2048, 00:15:04.322 "data_size": 63488 00:15:04.322 } 00:15:04.322 ] 00:15:04.322 }' 00:15:04.322 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:04.322 11:51:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.889 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # '[' 3 -gt 2 ']' 00:15:04.889 11:51:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:05.147 [2024-05-14 11:51:32.099742] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:05.147 [2024-05-14 11:51:32.099770] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:05.147 [2024-05-14 11:51:32.099828] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:05.147 [2024-05-14 11:51:32.099881] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:05.147 [2024-05-14 11:51:32.099893] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d2860 name raid_bdev1, state offline 00:15:05.147 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.147 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # jq -r '.[]' 
00:15:05.420 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # raid_bdev= 00:15:05.420 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@529 -- # '[' -n '' ']' 00:15:05.421 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:05.684 [2024-05-14 11:51:32.585015] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:05.684 [2024-05-14 11:51:32.585065] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:05.684 [2024-05-14 11:51:32.585086] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21da480 00:15:05.684 [2024-05-14 11:51:32.585099] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:05.684 [2024-05-14 11:51:32.586736] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:05.684 [2024-05-14 11:51:32.586766] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:05.684 [2024-05-14 11:51:32.586836] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:15:05.684 [2024-05-14 11:51:32.586862] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:05.684 pt1 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=0 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.684 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:05.943 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:05.943 "name": "raid_bdev1", 00:15:05.943 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:15:05.943 "strip_size_kb": 0, 00:15:05.943 "state": "configuring", 00:15:05.943 "raid_level": "raid1", 00:15:05.943 "superblock": true, 00:15:05.943 "num_base_bdevs": 3, 00:15:05.943 "num_base_bdevs_discovered": 1, 00:15:05.943 "num_base_bdevs_operational": 3, 00:15:05.943 "base_bdevs_list": [ 00:15:05.943 { 00:15:05.943 "name": "pt1", 00:15:05.943 "uuid": "7a138f98-1976-5afe-be81-6fda481a3e93", 00:15:05.943 "is_configured": true, 00:15:05.943 "data_offset": 2048, 00:15:05.943 "data_size": 63488 00:15:05.943 }, 00:15:05.943 { 00:15:05.943 "name": null, 00:15:05.943 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:15:05.943 "is_configured": false, 00:15:05.943 "data_offset": 2048, 00:15:05.943 "data_size": 63488 00:15:05.943 }, 00:15:05.943 { 00:15:05.943 "name": null, 00:15:05.943 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:15:05.943 "is_configured": false, 00:15:05.943 "data_offset": 2048, 00:15:05.943 
"data_size": 63488 00:15:05.943 } 00:15:05.943 ] 00:15:05.943 }' 00:15:05.943 11:51:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:05.943 11:51:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.510 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i = 1 )) 00:15:06.510 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:15:06.510 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:06.769 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:15:06.769 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:15:06.769 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:07.028 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:15:07.029 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:15:07.029 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # i=2 00:15:07.029 11:51:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:07.343 [2024-05-14 11:51:34.149147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:07.343 [2024-05-14 11:51:34.149206] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.343 [2024-05-14 11:51:34.149225] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d5b90 00:15:07.343 [2024-05-14 
11:51:34.149238] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.343 [2024-05-14 11:51:34.149608] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:07.343 [2024-05-14 11:51:34.149627] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:07.343 [2024-05-14 11:51:34.149691] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:15:07.343 [2024-05-14 11:51:34.149703] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt3 (4) greater than existing raid bdev raid_bdev1 (2) 00:15:07.343 [2024-05-14 11:51:34.149714] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:07.343 [2024-05-14 11:51:34.149730] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d9850 name raid_bdev1, state configuring 00:15:07.343 [2024-05-14 11:51:34.149760] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:07.343 pt3 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@551 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:07.343 "name": "raid_bdev1", 00:15:07.343 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:15:07.343 "strip_size_kb": 0, 00:15:07.343 "state": "configuring", 00:15:07.343 "raid_level": "raid1", 00:15:07.343 "superblock": true, 00:15:07.343 "num_base_bdevs": 3, 00:15:07.343 "num_base_bdevs_discovered": 1, 00:15:07.343 "num_base_bdevs_operational": 2, 00:15:07.343 "base_bdevs_list": [ 00:15:07.343 { 00:15:07.343 "name": null, 00:15:07.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.343 "is_configured": false, 00:15:07.343 "data_offset": 2048, 00:15:07.343 "data_size": 63488 00:15:07.343 }, 00:15:07.343 { 00:15:07.343 "name": null, 00:15:07.343 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:15:07.343 "is_configured": false, 00:15:07.343 "data_offset": 2048, 00:15:07.343 "data_size": 63488 00:15:07.343 }, 00:15:07.343 { 00:15:07.343 "name": "pt3", 00:15:07.343 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:15:07.343 "is_configured": true, 00:15:07.343 "data_offset": 2048, 00:15:07.343 "data_size": 63488 00:15:07.343 } 00:15:07.343 ] 00:15:07.343 }' 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:07.343 11:51:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:07.911 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i = 1 )) 00:15:07.911 11:51:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:15:07.911 11:51:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:08.171 [2024-05-14 11:51:35.163855] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:08.171 [2024-05-14 11:51:35.163909] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.171 [2024-05-14 11:51:35.163935] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d47b0 00:15:08.171 [2024-05-14 11:51:35.163948] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.171 [2024-05-14 11:51:35.164306] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:08.171 [2024-05-14 11:51:35.164323] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:08.171 [2024-05-14 11:51:35.164391] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:15:08.171 [2024-05-14 11:51:35.164425] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:08.171 [2024-05-14 11:51:35.164530] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x21d9af0 00:15:08.171 [2024-05-14 11:51:35.164540] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:08.171 [2024-05-14 11:51:35.164708] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d4e50 00:15:08.171 [2024-05-14 11:51:35.164837] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21d9af0 00:15:08.171 [2024-05-14 11:51:35.164847] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21d9af0 00:15:08.171 [2024-05-14 11:51:35.164945] 
bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:08.171 pt2 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i++ )) 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@559 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.171 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:08.430 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:08.430 "name": "raid_bdev1", 00:15:08.430 "uuid": "1709ffb9-5a15-4d8f-a496-dfadb01e8816", 00:15:08.430 "strip_size_kb": 0, 00:15:08.430 "state": "online", 00:15:08.430 
"raid_level": "raid1", 00:15:08.430 "superblock": true, 00:15:08.430 "num_base_bdevs": 3, 00:15:08.430 "num_base_bdevs_discovered": 2, 00:15:08.430 "num_base_bdevs_operational": 2, 00:15:08.430 "base_bdevs_list": [ 00:15:08.430 { 00:15:08.430 "name": null, 00:15:08.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.430 "is_configured": false, 00:15:08.430 "data_offset": 2048, 00:15:08.430 "data_size": 63488 00:15:08.430 }, 00:15:08.430 { 00:15:08.430 "name": "pt2", 00:15:08.430 "uuid": "81eda2c3-8b13-53bc-b2da-bb908b6804ca", 00:15:08.430 "is_configured": true, 00:15:08.430 "data_offset": 2048, 00:15:08.430 "data_size": 63488 00:15:08.430 }, 00:15:08.430 { 00:15:08.430 "name": "pt3", 00:15:08.431 "uuid": "838c032d-6616-5a1e-8321-cc66aece320e", 00:15:08.431 "is_configured": true, 00:15:08.431 "data_offset": 2048, 00:15:08.431 "data_size": 63488 00:15:08.431 } 00:15:08.431 ] 00:15:08.431 }' 00:15:08.431 11:51:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:08.431 11:51:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.998 11:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:08.998 11:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:15:09.258 [2024-05-14 11:51:36.246979] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # '[' 1709ffb9-5a15-4d8f-a496-dfadb01e8816 '!=' 1709ffb9-5a15-4d8f-a496-dfadb01e8816 ']' 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 1708833 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 1708833 ']' 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@950 -- # kill -0 1708833 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1708833 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1708833' 00:15:09.258 killing process with pid 1708833 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 1708833 00:15:09.258 [2024-05-14 11:51:36.317284] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:09.258 [2024-05-14 11:51:36.317351] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:09.258 [2024-05-14 11:51:36.317421] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:09.258 [2024-05-14 11:51:36.317435] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d9af0 name raid_bdev1, state offline 00:15:09.258 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 1708833 00:15:09.517 [2024-05-14 11:51:36.343990] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:09.517 11:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:15:09.517 00:15:09.517 real 0m22.216s 00:15:09.517 user 0m40.668s 00:15:09.517 sys 0m3.932s 00:15:09.517 11:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:15:09.517 11:51:36 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:15:09.517 ************************************ 00:15:09.517 END TEST raid_superblock_test 00:15:09.517 ************************************ 00:15:09.517 11:51:36 bdev_raid -- bdev/bdev_raid.sh@813 -- # for n in {2..4} 00:15:09.517 11:51:36 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:15:09.517 11:51:36 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:15:09.517 11:51:36 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:15:09.517 11:51:36 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:15:09.517 11:51:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:09.777 ************************************ 00:15:09.777 START TEST raid_state_function_test 00:15:09.777 ************************************ 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 4 false 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:09.777 
11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:15:09.777 11:51:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=1712295 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1712295' 00:15:09.777 Process raid pid: 1712295 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 1712295 /var/tmp/spdk-raid.sock 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 1712295 ']' 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:09.777 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:15:09.777 11:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.777 [2024-05-14 11:51:36.699691] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:15:09.777 [2024-05-14 11:51:36.699751] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:09.777 [2024-05-14 11:51:36.826949] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:10.037 [2024-05-14 11:51:36.934287] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:10.037 [2024-05-14 11:51:36.994970] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:10.037 [2024-05-14 11:51:36.995000] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:10.605 11:51:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:15:10.605 11:51:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:15:10.605 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:10.864 [2024-05-14 11:51:37.854161] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:10.865 [2024-05-14 11:51:37.854202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:10.865 [2024-05-14 11:51:37.854213] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:10.865 [2024-05-14 11:51:37.854225] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:10.865 [2024-05-14 11:51:37.854234] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:10.865 [2024-05-14 11:51:37.854246] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:10.865 [2024-05-14 
11:51:37.854255] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:10.865 [2024-05-14 11:51:37.854266] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.865 11:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.124 11:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:11.124 "name": "Existed_Raid", 00:15:11.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.124 "strip_size_kb": 64, 00:15:11.124 "state": 
"configuring", 00:15:11.124 "raid_level": "raid0", 00:15:11.124 "superblock": false, 00:15:11.124 "num_base_bdevs": 4, 00:15:11.124 "num_base_bdevs_discovered": 0, 00:15:11.124 "num_base_bdevs_operational": 4, 00:15:11.124 "base_bdevs_list": [ 00:15:11.124 { 00:15:11.124 "name": "BaseBdev1", 00:15:11.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.124 "is_configured": false, 00:15:11.124 "data_offset": 0, 00:15:11.124 "data_size": 0 00:15:11.124 }, 00:15:11.124 { 00:15:11.124 "name": "BaseBdev2", 00:15:11.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.124 "is_configured": false, 00:15:11.124 "data_offset": 0, 00:15:11.124 "data_size": 0 00:15:11.124 }, 00:15:11.124 { 00:15:11.124 "name": "BaseBdev3", 00:15:11.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.124 "is_configured": false, 00:15:11.124 "data_offset": 0, 00:15:11.124 "data_size": 0 00:15:11.124 }, 00:15:11.124 { 00:15:11.124 "name": "BaseBdev4", 00:15:11.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.124 "is_configured": false, 00:15:11.124 "data_offset": 0, 00:15:11.124 "data_size": 0 00:15:11.124 } 00:15:11.124 ] 00:15:11.124 }' 00:15:11.124 11:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:11.124 11:51:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.693 11:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:11.952 [2024-05-14 11:51:38.925021] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:11.952 [2024-05-14 11:51:38.925056] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a50720 name Existed_Raid, state configuring 00:15:11.952 11:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:12.212 [2024-05-14 11:51:39.173695] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:12.212 [2024-05-14 11:51:39.173723] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:12.212 [2024-05-14 11:51:39.173733] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:12.212 [2024-05-14 11:51:39.173745] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:12.212 [2024-05-14 11:51:39.173754] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:12.212 [2024-05-14 11:51:39.173765] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:12.212 [2024-05-14 11:51:39.173775] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:12.212 [2024-05-14 11:51:39.173786] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:12.212 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:12.472 [2024-05-14 11:51:39.432270] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:12.472 BaseBdev1 00:15:12.472 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:15:12.472 11:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:15:12.472 11:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:12.472 11:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:12.472 11:51:39 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:12.472 11:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:12.472 11:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:12.731 11:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:12.991 [ 00:15:12.991 { 00:15:12.991 "name": "BaseBdev1", 00:15:12.991 "aliases": [ 00:15:12.991 "8e7445e4-11c5-4cc2-8105-2452a750a5e9" 00:15:12.991 ], 00:15:12.991 "product_name": "Malloc disk", 00:15:12.991 "block_size": 512, 00:15:12.991 "num_blocks": 65536, 00:15:12.991 "uuid": "8e7445e4-11c5-4cc2-8105-2452a750a5e9", 00:15:12.991 "assigned_rate_limits": { 00:15:12.991 "rw_ios_per_sec": 0, 00:15:12.991 "rw_mbytes_per_sec": 0, 00:15:12.991 "r_mbytes_per_sec": 0, 00:15:12.991 "w_mbytes_per_sec": 0 00:15:12.991 }, 00:15:12.991 "claimed": true, 00:15:12.991 "claim_type": "exclusive_write", 00:15:12.991 "zoned": false, 00:15:12.991 "supported_io_types": { 00:15:12.991 "read": true, 00:15:12.991 "write": true, 00:15:12.991 "unmap": true, 00:15:12.991 "write_zeroes": true, 00:15:12.991 "flush": true, 00:15:12.991 "reset": true, 00:15:12.991 "compare": false, 00:15:12.991 "compare_and_write": false, 00:15:12.991 "abort": true, 00:15:12.991 "nvme_admin": false, 00:15:12.991 "nvme_io": false 00:15:12.991 }, 00:15:12.991 "memory_domains": [ 00:15:12.991 { 00:15:12.991 "dma_device_id": "system", 00:15:12.991 "dma_device_type": 1 00:15:12.991 }, 00:15:12.991 { 00:15:12.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.991 "dma_device_type": 2 00:15:12.991 } 00:15:12.991 ], 00:15:12.991 "driver_specific": {} 00:15:12.991 } 00:15:12.991 ] 00:15:12.991 
11:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:12.991 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.992 11:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.252 11:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:13.252 "name": "Existed_Raid", 00:15:13.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.252 "strip_size_kb": 64, 00:15:13.252 "state": "configuring", 00:15:13.252 "raid_level": "raid0", 00:15:13.252 "superblock": false, 00:15:13.252 "num_base_bdevs": 4, 00:15:13.252 
"num_base_bdevs_discovered": 1, 00:15:13.252 "num_base_bdevs_operational": 4, 00:15:13.252 "base_bdevs_list": [ 00:15:13.252 { 00:15:13.252 "name": "BaseBdev1", 00:15:13.252 "uuid": "8e7445e4-11c5-4cc2-8105-2452a750a5e9", 00:15:13.252 "is_configured": true, 00:15:13.252 "data_offset": 0, 00:15:13.252 "data_size": 65536 00:15:13.252 }, 00:15:13.252 { 00:15:13.252 "name": "BaseBdev2", 00:15:13.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.252 "is_configured": false, 00:15:13.252 "data_offset": 0, 00:15:13.252 "data_size": 0 00:15:13.252 }, 00:15:13.252 { 00:15:13.252 "name": "BaseBdev3", 00:15:13.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.252 "is_configured": false, 00:15:13.252 "data_offset": 0, 00:15:13.252 "data_size": 0 00:15:13.252 }, 00:15:13.252 { 00:15:13.252 "name": "BaseBdev4", 00:15:13.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.252 "is_configured": false, 00:15:13.252 "data_offset": 0, 00:15:13.252 "data_size": 0 00:15:13.252 } 00:15:13.252 ] 00:15:13.252 }' 00:15:13.252 11:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:13.253 11:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.830 11:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:14.089 [2024-05-14 11:51:41.000474] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:14.089 [2024-05-14 11:51:41.000512] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a4ffb0 name Existed_Raid, state configuring 00:15:14.089 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:14.348 
[2024-05-14 11:51:41.245151] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:14.348 [2024-05-14 11:51:41.246640] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:14.348 [2024-05-14 11:51:41.246672] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:14.348 [2024-05-14 11:51:41.246683] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:14.348 [2024-05-14 11:51:41.246695] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:14.348 [2024-05-14 11:51:41.246705] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:14.348 [2024-05-14 11:51:41.246716] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:14.348 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:15:14.348 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:14.348 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:14.348 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:14.348 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:14.348 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:14.348 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:14.348 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:14.348 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:14.349 11:51:41 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:14.349 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:14.349 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:14.349 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.349 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.609 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:14.609 "name": "Existed_Raid", 00:15:14.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.609 "strip_size_kb": 64, 00:15:14.609 "state": "configuring", 00:15:14.609 "raid_level": "raid0", 00:15:14.609 "superblock": false, 00:15:14.609 "num_base_bdevs": 4, 00:15:14.609 "num_base_bdevs_discovered": 1, 00:15:14.609 "num_base_bdevs_operational": 4, 00:15:14.609 "base_bdevs_list": [ 00:15:14.609 { 00:15:14.609 "name": "BaseBdev1", 00:15:14.609 "uuid": "8e7445e4-11c5-4cc2-8105-2452a750a5e9", 00:15:14.609 "is_configured": true, 00:15:14.609 "data_offset": 0, 00:15:14.609 "data_size": 65536 00:15:14.609 }, 00:15:14.609 { 00:15:14.609 "name": "BaseBdev2", 00:15:14.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.609 "is_configured": false, 00:15:14.609 "data_offset": 0, 00:15:14.609 "data_size": 0 00:15:14.609 }, 00:15:14.609 { 00:15:14.609 "name": "BaseBdev3", 00:15:14.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.609 "is_configured": false, 00:15:14.609 "data_offset": 0, 00:15:14.609 "data_size": 0 00:15:14.609 }, 00:15:14.609 { 00:15:14.609 "name": "BaseBdev4", 00:15:14.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.609 "is_configured": false, 00:15:14.609 "data_offset": 0, 00:15:14.609 "data_size": 0 00:15:14.609 } 
00:15:14.609 ] 00:15:14.609 }' 00:15:14.609 11:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:14.609 11:51:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.177 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:15.436 [2024-05-14 11:51:42.347464] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:15.436 BaseBdev2 00:15:15.436 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:15:15.436 11:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:15.436 11:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:15.436 11:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:15.436 11:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:15.436 11:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:15.436 11:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.713 11:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:15.972 [ 00:15:15.972 { 00:15:15.972 "name": "BaseBdev2", 00:15:15.972 "aliases": [ 00:15:15.972 "10006ce6-9f73-4e41-8d5b-f4c5c606e92e" 00:15:15.972 ], 00:15:15.972 "product_name": "Malloc disk", 00:15:15.972 "block_size": 512, 00:15:15.972 "num_blocks": 65536, 00:15:15.972 "uuid": 
"10006ce6-9f73-4e41-8d5b-f4c5c606e92e", 00:15:15.972 "assigned_rate_limits": { 00:15:15.972 "rw_ios_per_sec": 0, 00:15:15.972 "rw_mbytes_per_sec": 0, 00:15:15.972 "r_mbytes_per_sec": 0, 00:15:15.972 "w_mbytes_per_sec": 0 00:15:15.972 }, 00:15:15.972 "claimed": true, 00:15:15.972 "claim_type": "exclusive_write", 00:15:15.972 "zoned": false, 00:15:15.972 "supported_io_types": { 00:15:15.972 "read": true, 00:15:15.972 "write": true, 00:15:15.972 "unmap": true, 00:15:15.972 "write_zeroes": true, 00:15:15.972 "flush": true, 00:15:15.972 "reset": true, 00:15:15.972 "compare": false, 00:15:15.972 "compare_and_write": false, 00:15:15.972 "abort": true, 00:15:15.972 "nvme_admin": false, 00:15:15.972 "nvme_io": false 00:15:15.972 }, 00:15:15.972 "memory_domains": [ 00:15:15.972 { 00:15:15.972 "dma_device_id": "system", 00:15:15.972 "dma_device_type": 1 00:15:15.972 }, 00:15:15.972 { 00:15:15.972 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.972 "dma_device_type": 2 00:15:15.972 } 00:15:15.972 ], 00:15:15.972 "driver_specific": {} 00:15:15.972 } 00:15:15.972 ] 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 
00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.972 11:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.232 11:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:16.232 "name": "Existed_Raid", 00:15:16.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.232 "strip_size_kb": 64, 00:15:16.232 "state": "configuring", 00:15:16.232 "raid_level": "raid0", 00:15:16.232 "superblock": false, 00:15:16.232 "num_base_bdevs": 4, 00:15:16.232 "num_base_bdevs_discovered": 2, 00:15:16.232 "num_base_bdevs_operational": 4, 00:15:16.232 "base_bdevs_list": [ 00:15:16.232 { 00:15:16.232 "name": "BaseBdev1", 00:15:16.232 "uuid": "8e7445e4-11c5-4cc2-8105-2452a750a5e9", 00:15:16.232 "is_configured": true, 00:15:16.232 "data_offset": 0, 00:15:16.232 "data_size": 65536 00:15:16.232 }, 00:15:16.232 { 00:15:16.232 "name": "BaseBdev2", 00:15:16.232 "uuid": "10006ce6-9f73-4e41-8d5b-f4c5c606e92e", 00:15:16.232 "is_configured": true, 00:15:16.232 "data_offset": 0, 00:15:16.232 "data_size": 65536 00:15:16.232 }, 00:15:16.232 { 00:15:16.232 "name": "BaseBdev3", 00:15:16.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.232 "is_configured": false, 
00:15:16.232 "data_offset": 0, 00:15:16.232 "data_size": 0 00:15:16.232 }, 00:15:16.232 { 00:15:16.232 "name": "BaseBdev4", 00:15:16.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.232 "is_configured": false, 00:15:16.232 "data_offset": 0, 00:15:16.232 "data_size": 0 00:15:16.232 } 00:15:16.232 ] 00:15:16.232 }' 00:15:16.232 11:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:16.232 11:51:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.802 11:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:16.802 [2024-05-14 11:51:43.850863] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:16.802 BaseBdev3 00:15:16.802 11:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:15:16.802 11:51:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:16.802 11:51:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:16.802 11:51:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:16.802 11:51:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:16.802 11:51:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:16.802 11:51:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:17.061 11:51:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 
00:15:17.321 [ 00:15:17.321 { 00:15:17.321 "name": "BaseBdev3", 00:15:17.321 "aliases": [ 00:15:17.321 "5de72523-fce3-449d-804f-9e583c5f15f9" 00:15:17.321 ], 00:15:17.321 "product_name": "Malloc disk", 00:15:17.321 "block_size": 512, 00:15:17.321 "num_blocks": 65536, 00:15:17.321 "uuid": "5de72523-fce3-449d-804f-9e583c5f15f9", 00:15:17.321 "assigned_rate_limits": { 00:15:17.321 "rw_ios_per_sec": 0, 00:15:17.321 "rw_mbytes_per_sec": 0, 00:15:17.321 "r_mbytes_per_sec": 0, 00:15:17.321 "w_mbytes_per_sec": 0 00:15:17.321 }, 00:15:17.321 "claimed": true, 00:15:17.321 "claim_type": "exclusive_write", 00:15:17.321 "zoned": false, 00:15:17.321 "supported_io_types": { 00:15:17.321 "read": true, 00:15:17.321 "write": true, 00:15:17.321 "unmap": true, 00:15:17.321 "write_zeroes": true, 00:15:17.321 "flush": true, 00:15:17.321 "reset": true, 00:15:17.321 "compare": false, 00:15:17.321 "compare_and_write": false, 00:15:17.321 "abort": true, 00:15:17.321 "nvme_admin": false, 00:15:17.321 "nvme_io": false 00:15:17.321 }, 00:15:17.321 "memory_domains": [ 00:15:17.321 { 00:15:17.321 "dma_device_id": "system", 00:15:17.321 "dma_device_type": 1 00:15:17.321 }, 00:15:17.321 { 00:15:17.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.321 "dma_device_type": 2 00:15:17.321 } 00:15:17.321 ], 00:15:17.321 "driver_specific": {} 00:15:17.321 } 00:15:17.321 ] 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.321 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.581 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:17.581 "name": "Existed_Raid", 00:15:17.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.581 "strip_size_kb": 64, 00:15:17.581 "state": "configuring", 00:15:17.581 "raid_level": "raid0", 00:15:17.581 "superblock": false, 00:15:17.581 "num_base_bdevs": 4, 00:15:17.581 "num_base_bdevs_discovered": 3, 00:15:17.581 "num_base_bdevs_operational": 4, 00:15:17.581 "base_bdevs_list": [ 00:15:17.581 { 00:15:17.581 "name": "BaseBdev1", 00:15:17.581 "uuid": "8e7445e4-11c5-4cc2-8105-2452a750a5e9", 00:15:17.581 "is_configured": true, 00:15:17.581 "data_offset": 0, 00:15:17.581 "data_size": 65536 00:15:17.581 }, 00:15:17.581 { 00:15:17.581 "name": "BaseBdev2", 00:15:17.581 "uuid": 
"10006ce6-9f73-4e41-8d5b-f4c5c606e92e", 00:15:17.581 "is_configured": true, 00:15:17.581 "data_offset": 0, 00:15:17.581 "data_size": 65536 00:15:17.581 }, 00:15:17.581 { 00:15:17.581 "name": "BaseBdev3", 00:15:17.581 "uuid": "5de72523-fce3-449d-804f-9e583c5f15f9", 00:15:17.581 "is_configured": true, 00:15:17.581 "data_offset": 0, 00:15:17.581 "data_size": 65536 00:15:17.581 }, 00:15:17.581 { 00:15:17.581 "name": "BaseBdev4", 00:15:17.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.581 "is_configured": false, 00:15:17.581 "data_offset": 0, 00:15:17.581 "data_size": 0 00:15:17.581 } 00:15:17.581 ] 00:15:17.581 }' 00:15:17.581 11:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:17.581 11:51:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.151 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:18.411 [2024-05-14 11:51:45.434510] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:18.411 [2024-05-14 11:51:45.434552] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a511b0 00:15:18.411 [2024-05-14 11:51:45.434561] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:18.411 [2024-05-14 11:51:45.434763] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a52860 00:15:18.411 [2024-05-14 11:51:45.434889] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a511b0 00:15:18.411 [2024-05-14 11:51:45.434899] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a511b0 00:15:18.411 [2024-05-14 11:51:45.435065] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:18.411 BaseBdev4 00:15:18.411 11:51:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:15:18.411 11:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:15:18.411 11:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:18.411 11:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:18.411 11:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:18.411 11:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:18.411 11:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:18.670 11:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:18.930 [ 00:15:18.930 { 00:15:18.930 "name": "BaseBdev4", 00:15:18.930 "aliases": [ 00:15:18.930 "a464a00f-cee6-4828-b363-20a3ebb4458e" 00:15:18.930 ], 00:15:18.930 "product_name": "Malloc disk", 00:15:18.930 "block_size": 512, 00:15:18.930 "num_blocks": 65536, 00:15:18.930 "uuid": "a464a00f-cee6-4828-b363-20a3ebb4458e", 00:15:18.930 "assigned_rate_limits": { 00:15:18.930 "rw_ios_per_sec": 0, 00:15:18.930 "rw_mbytes_per_sec": 0, 00:15:18.930 "r_mbytes_per_sec": 0, 00:15:18.930 "w_mbytes_per_sec": 0 00:15:18.930 }, 00:15:18.930 "claimed": true, 00:15:18.930 "claim_type": "exclusive_write", 00:15:18.930 "zoned": false, 00:15:18.930 "supported_io_types": { 00:15:18.930 "read": true, 00:15:18.930 "write": true, 00:15:18.930 "unmap": true, 00:15:18.930 "write_zeroes": true, 00:15:18.930 "flush": true, 00:15:18.930 "reset": true, 00:15:18.930 "compare": false, 00:15:18.930 "compare_and_write": false, 
00:15:18.930 "abort": true, 00:15:18.930 "nvme_admin": false, 00:15:18.930 "nvme_io": false 00:15:18.930 }, 00:15:18.930 "memory_domains": [ 00:15:18.930 { 00:15:18.930 "dma_device_id": "system", 00:15:18.930 "dma_device_type": 1 00:15:18.930 }, 00:15:18.930 { 00:15:18.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.930 "dma_device_type": 2 00:15:18.930 } 00:15:18.930 ], 00:15:18.930 "driver_specific": {} 00:15:18.930 } 00:15:18.930 ] 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.930 11:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.190 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:19.190 "name": "Existed_Raid", 00:15:19.190 "uuid": "74d770d4-abd1-4ea2-8c9e-3aa5f7e39a0e", 00:15:19.190 "strip_size_kb": 64, 00:15:19.190 "state": "online", 00:15:19.190 "raid_level": "raid0", 00:15:19.190 "superblock": false, 00:15:19.190 "num_base_bdevs": 4, 00:15:19.190 "num_base_bdevs_discovered": 4, 00:15:19.190 "num_base_bdevs_operational": 4, 00:15:19.190 "base_bdevs_list": [ 00:15:19.190 { 00:15:19.190 "name": "BaseBdev1", 00:15:19.190 "uuid": "8e7445e4-11c5-4cc2-8105-2452a750a5e9", 00:15:19.190 "is_configured": true, 00:15:19.190 "data_offset": 0, 00:15:19.190 "data_size": 65536 00:15:19.190 }, 00:15:19.190 { 00:15:19.190 "name": "BaseBdev2", 00:15:19.190 "uuid": "10006ce6-9f73-4e41-8d5b-f4c5c606e92e", 00:15:19.190 "is_configured": true, 00:15:19.190 "data_offset": 0, 00:15:19.190 "data_size": 65536 00:15:19.190 }, 00:15:19.190 { 00:15:19.190 "name": "BaseBdev3", 00:15:19.190 "uuid": "5de72523-fce3-449d-804f-9e583c5f15f9", 00:15:19.190 "is_configured": true, 00:15:19.190 "data_offset": 0, 00:15:19.190 "data_size": 65536 00:15:19.190 }, 00:15:19.190 { 00:15:19.190 "name": "BaseBdev4", 00:15:19.190 "uuid": "a464a00f-cee6-4828-b363-20a3ebb4458e", 00:15:19.190 "is_configured": true, 00:15:19.190 "data_offset": 0, 00:15:19.190 "data_size": 65536 00:15:19.190 } 00:15:19.190 ] 00:15:19.190 }' 00:15:19.190 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:19.190 11:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:19.760 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties 
Existed_Raid 00:15:19.760 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:15:19.760 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:19.760 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:19.760 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:19.760 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:15:19.760 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:19.760 11:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:20.019 [2024-05-14 11:51:46.986931] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:20.019 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:20.019 "name": "Existed_Raid", 00:15:20.019 "aliases": [ 00:15:20.019 "74d770d4-abd1-4ea2-8c9e-3aa5f7e39a0e" 00:15:20.019 ], 00:15:20.019 "product_name": "Raid Volume", 00:15:20.019 "block_size": 512, 00:15:20.019 "num_blocks": 262144, 00:15:20.019 "uuid": "74d770d4-abd1-4ea2-8c9e-3aa5f7e39a0e", 00:15:20.019 "assigned_rate_limits": { 00:15:20.019 "rw_ios_per_sec": 0, 00:15:20.019 "rw_mbytes_per_sec": 0, 00:15:20.019 "r_mbytes_per_sec": 0, 00:15:20.019 "w_mbytes_per_sec": 0 00:15:20.019 }, 00:15:20.019 "claimed": false, 00:15:20.019 "zoned": false, 00:15:20.019 "supported_io_types": { 00:15:20.019 "read": true, 00:15:20.019 "write": true, 00:15:20.019 "unmap": true, 00:15:20.019 "write_zeroes": true, 00:15:20.019 "flush": true, 00:15:20.019 "reset": true, 00:15:20.019 "compare": false, 00:15:20.019 "compare_and_write": false, 00:15:20.019 "abort": false, 00:15:20.019 "nvme_admin": false, 
00:15:20.019 "nvme_io": false 00:15:20.019 }, 00:15:20.019 "memory_domains": [ 00:15:20.019 { 00:15:20.019 "dma_device_id": "system", 00:15:20.019 "dma_device_type": 1 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.019 "dma_device_type": 2 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "dma_device_id": "system", 00:15:20.019 "dma_device_type": 1 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.019 "dma_device_type": 2 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "dma_device_id": "system", 00:15:20.019 "dma_device_type": 1 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.019 "dma_device_type": 2 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "dma_device_id": "system", 00:15:20.019 "dma_device_type": 1 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.019 "dma_device_type": 2 00:15:20.019 } 00:15:20.019 ], 00:15:20.019 "driver_specific": { 00:15:20.019 "raid": { 00:15:20.019 "uuid": "74d770d4-abd1-4ea2-8c9e-3aa5f7e39a0e", 00:15:20.019 "strip_size_kb": 64, 00:15:20.019 "state": "online", 00:15:20.019 "raid_level": "raid0", 00:15:20.019 "superblock": false, 00:15:20.019 "num_base_bdevs": 4, 00:15:20.019 "num_base_bdevs_discovered": 4, 00:15:20.019 "num_base_bdevs_operational": 4, 00:15:20.019 "base_bdevs_list": [ 00:15:20.019 { 00:15:20.019 "name": "BaseBdev1", 00:15:20.019 "uuid": "8e7445e4-11c5-4cc2-8105-2452a750a5e9", 00:15:20.019 "is_configured": true, 00:15:20.019 "data_offset": 0, 00:15:20.019 "data_size": 65536 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "name": "BaseBdev2", 00:15:20.019 "uuid": "10006ce6-9f73-4e41-8d5b-f4c5c606e92e", 00:15:20.019 "is_configured": true, 00:15:20.019 "data_offset": 0, 00:15:20.019 "data_size": 65536 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "name": "BaseBdev3", 00:15:20.019 "uuid": "5de72523-fce3-449d-804f-9e583c5f15f9", 00:15:20.019 "is_configured": 
true, 00:15:20.019 "data_offset": 0, 00:15:20.019 "data_size": 65536 00:15:20.019 }, 00:15:20.019 { 00:15:20.019 "name": "BaseBdev4", 00:15:20.019 "uuid": "a464a00f-cee6-4828-b363-20a3ebb4458e", 00:15:20.019 "is_configured": true, 00:15:20.019 "data_offset": 0, 00:15:20.019 "data_size": 65536 00:15:20.019 } 00:15:20.019 ] 00:15:20.019 } 00:15:20.019 } 00:15:20.019 }' 00:15:20.019 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:20.019 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:15:20.019 BaseBdev2 00:15:20.019 BaseBdev3 00:15:20.019 BaseBdev4' 00:15:20.019 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:20.019 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:20.019 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:20.278 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:20.278 "name": "BaseBdev1", 00:15:20.278 "aliases": [ 00:15:20.278 "8e7445e4-11c5-4cc2-8105-2452a750a5e9" 00:15:20.278 ], 00:15:20.278 "product_name": "Malloc disk", 00:15:20.278 "block_size": 512, 00:15:20.278 "num_blocks": 65536, 00:15:20.278 "uuid": "8e7445e4-11c5-4cc2-8105-2452a750a5e9", 00:15:20.278 "assigned_rate_limits": { 00:15:20.278 "rw_ios_per_sec": 0, 00:15:20.278 "rw_mbytes_per_sec": 0, 00:15:20.278 "r_mbytes_per_sec": 0, 00:15:20.278 "w_mbytes_per_sec": 0 00:15:20.278 }, 00:15:20.278 "claimed": true, 00:15:20.278 "claim_type": "exclusive_write", 00:15:20.278 "zoned": false, 00:15:20.278 "supported_io_types": { 00:15:20.278 "read": true, 00:15:20.278 "write": true, 00:15:20.278 "unmap": true, 00:15:20.278 "write_zeroes": 
true, 00:15:20.278 "flush": true, 00:15:20.278 "reset": true, 00:15:20.278 "compare": false, 00:15:20.278 "compare_and_write": false, 00:15:20.278 "abort": true, 00:15:20.278 "nvme_admin": false, 00:15:20.278 "nvme_io": false 00:15:20.278 }, 00:15:20.278 "memory_domains": [ 00:15:20.278 { 00:15:20.278 "dma_device_id": "system", 00:15:20.278 "dma_device_type": 1 00:15:20.278 }, 00:15:20.278 { 00:15:20.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.278 "dma_device_type": 2 00:15:20.278 } 00:15:20.278 ], 00:15:20.278 "driver_specific": {} 00:15:20.278 }' 00:15:20.278 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:20.278 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:20.537 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:20.537 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:20.537 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:20.537 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:20.537 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:20.537 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:20.537 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:20.537 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:20.537 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:20.796 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:20.796 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:20.796 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:20.796 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:20.796 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:20.796 "name": "BaseBdev2", 00:15:20.796 "aliases": [ 00:15:20.796 "10006ce6-9f73-4e41-8d5b-f4c5c606e92e" 00:15:20.796 ], 00:15:20.796 "product_name": "Malloc disk", 00:15:20.796 "block_size": 512, 00:15:20.796 "num_blocks": 65536, 00:15:20.796 "uuid": "10006ce6-9f73-4e41-8d5b-f4c5c606e92e", 00:15:20.796 "assigned_rate_limits": { 00:15:20.796 "rw_ios_per_sec": 0, 00:15:20.796 "rw_mbytes_per_sec": 0, 00:15:20.796 "r_mbytes_per_sec": 0, 00:15:20.796 "w_mbytes_per_sec": 0 00:15:20.796 }, 00:15:20.796 "claimed": true, 00:15:20.796 "claim_type": "exclusive_write", 00:15:20.796 "zoned": false, 00:15:20.796 "supported_io_types": { 00:15:20.796 "read": true, 00:15:20.796 "write": true, 00:15:20.796 "unmap": true, 00:15:20.796 "write_zeroes": true, 00:15:20.796 "flush": true, 00:15:20.796 "reset": true, 00:15:20.796 "compare": false, 00:15:20.796 "compare_and_write": false, 00:15:20.796 "abort": true, 00:15:20.796 "nvme_admin": false, 00:15:20.796 "nvme_io": false 00:15:20.796 }, 00:15:20.796 "memory_domains": [ 00:15:20.796 { 00:15:20.796 "dma_device_id": "system", 00:15:20.796 "dma_device_type": 1 00:15:20.796 }, 00:15:20.796 { 00:15:20.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.796 "dma_device_type": 2 00:15:20.796 } 00:15:20.796 ], 00:15:20.796 "driver_specific": {} 00:15:20.796 }' 00:15:20.796 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:21.055 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:21.055 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:21.055 11:51:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:21.055 11:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:21.055 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:21.055 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:21.055 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:21.056 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:21.056 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:21.379 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:21.379 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:21.379 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:21.379 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:21.379 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:21.638 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:21.638 "name": "BaseBdev3", 00:15:21.638 "aliases": [ 00:15:21.638 "5de72523-fce3-449d-804f-9e583c5f15f9" 00:15:21.638 ], 00:15:21.638 "product_name": "Malloc disk", 00:15:21.638 "block_size": 512, 00:15:21.638 "num_blocks": 65536, 00:15:21.638 "uuid": "5de72523-fce3-449d-804f-9e583c5f15f9", 00:15:21.638 "assigned_rate_limits": { 00:15:21.638 "rw_ios_per_sec": 0, 00:15:21.638 "rw_mbytes_per_sec": 0, 00:15:21.638 "r_mbytes_per_sec": 0, 00:15:21.638 "w_mbytes_per_sec": 0 00:15:21.638 }, 00:15:21.638 "claimed": true, 00:15:21.638 "claim_type": "exclusive_write", 
00:15:21.638 "zoned": false, 00:15:21.638 "supported_io_types": { 00:15:21.638 "read": true, 00:15:21.638 "write": true, 00:15:21.638 "unmap": true, 00:15:21.638 "write_zeroes": true, 00:15:21.638 "flush": true, 00:15:21.638 "reset": true, 00:15:21.638 "compare": false, 00:15:21.638 "compare_and_write": false, 00:15:21.638 "abort": true, 00:15:21.638 "nvme_admin": false, 00:15:21.638 "nvme_io": false 00:15:21.638 }, 00:15:21.638 "memory_domains": [ 00:15:21.638 { 00:15:21.638 "dma_device_id": "system", 00:15:21.638 "dma_device_type": 1 00:15:21.638 }, 00:15:21.638 { 00:15:21.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.638 "dma_device_type": 2 00:15:21.638 } 00:15:21.638 ], 00:15:21.638 "driver_specific": {} 00:15:21.638 }' 00:15:21.638 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:21.638 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:21.638 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:21.638 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:21.638 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:21.638 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:21.638 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:21.638 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:21.897 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:21.897 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:21.897 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:21.897 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:21.897 11:51:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:21.897 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:21.897 11:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:22.155 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:22.155 "name": "BaseBdev4", 00:15:22.155 "aliases": [ 00:15:22.155 "a464a00f-cee6-4828-b363-20a3ebb4458e" 00:15:22.155 ], 00:15:22.155 "product_name": "Malloc disk", 00:15:22.155 "block_size": 512, 00:15:22.155 "num_blocks": 65536, 00:15:22.155 "uuid": "a464a00f-cee6-4828-b363-20a3ebb4458e", 00:15:22.155 "assigned_rate_limits": { 00:15:22.155 "rw_ios_per_sec": 0, 00:15:22.155 "rw_mbytes_per_sec": 0, 00:15:22.155 "r_mbytes_per_sec": 0, 00:15:22.155 "w_mbytes_per_sec": 0 00:15:22.155 }, 00:15:22.155 "claimed": true, 00:15:22.155 "claim_type": "exclusive_write", 00:15:22.155 "zoned": false, 00:15:22.155 "supported_io_types": { 00:15:22.155 "read": true, 00:15:22.155 "write": true, 00:15:22.155 "unmap": true, 00:15:22.155 "write_zeroes": true, 00:15:22.155 "flush": true, 00:15:22.155 "reset": true, 00:15:22.155 "compare": false, 00:15:22.155 "compare_and_write": false, 00:15:22.155 "abort": true, 00:15:22.155 "nvme_admin": false, 00:15:22.155 "nvme_io": false 00:15:22.155 }, 00:15:22.155 "memory_domains": [ 00:15:22.155 { 00:15:22.155 "dma_device_id": "system", 00:15:22.155 "dma_device_type": 1 00:15:22.155 }, 00:15:22.155 { 00:15:22.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.155 "dma_device_type": 2 00:15:22.155 } 00:15:22.155 ], 00:15:22.155 "driver_specific": {} 00:15:22.155 }' 00:15:22.155 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:22.155 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 
-- # jq .block_size 00:15:22.155 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:22.155 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:22.155 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:22.155 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:22.155 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:22.414 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:22.414 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:22.414 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:22.414 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:22.414 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:22.414 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:22.673 [2024-05-14 11:51:49.633780] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:22.673 [2024-05-14 11:51:49.633806] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:22.673 [2024-05-14 11:51:49.633855] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@216 -- # return 1 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.673 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.932 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:22.932 "name": "Existed_Raid", 00:15:22.932 "uuid": "74d770d4-abd1-4ea2-8c9e-3aa5f7e39a0e", 00:15:22.932 "strip_size_kb": 64, 00:15:22.932 "state": "offline", 00:15:22.932 "raid_level": "raid0", 00:15:22.932 "superblock": false, 00:15:22.932 
"num_base_bdevs": 4, 00:15:22.932 "num_base_bdevs_discovered": 3, 00:15:22.932 "num_base_bdevs_operational": 3, 00:15:22.932 "base_bdevs_list": [ 00:15:22.932 { 00:15:22.932 "name": null, 00:15:22.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:22.932 "is_configured": false, 00:15:22.932 "data_offset": 0, 00:15:22.932 "data_size": 65536 00:15:22.932 }, 00:15:22.932 { 00:15:22.932 "name": "BaseBdev2", 00:15:22.932 "uuid": "10006ce6-9f73-4e41-8d5b-f4c5c606e92e", 00:15:22.932 "is_configured": true, 00:15:22.932 "data_offset": 0, 00:15:22.932 "data_size": 65536 00:15:22.932 }, 00:15:22.932 { 00:15:22.932 "name": "BaseBdev3", 00:15:22.932 "uuid": "5de72523-fce3-449d-804f-9e583c5f15f9", 00:15:22.932 "is_configured": true, 00:15:22.932 "data_offset": 0, 00:15:22.932 "data_size": 65536 00:15:22.932 }, 00:15:22.932 { 00:15:22.932 "name": "BaseBdev4", 00:15:22.932 "uuid": "a464a00f-cee6-4828-b363-20a3ebb4458e", 00:15:22.932 "is_configured": true, 00:15:22.932 "data_offset": 0, 00:15:22.932 "data_size": 65536 00:15:22.932 } 00:15:22.932 ] 00:15:22.932 }' 00:15:22.932 11:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:22.932 11:51:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:23.500 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:15:23.500 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:23.500 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:23.500 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.760 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:23.760 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:15:23.760 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:24.019 [2024-05-14 11:51:50.955207] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:24.019 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:24.019 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:24.019 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.019 11:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:24.278 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:24.278 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:24.278 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:24.537 [2024-05-14 11:51:51.444883] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:24.537 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:24.537 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:24.537 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.537 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:24.798 11:51:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:24.798 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:24.798 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:25.058 [2024-05-14 11:51:51.938512] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:25.058 [2024-05-14 11:51:51.938561] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a511b0 name Existed_Raid, state offline 00:15:25.058 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:25.058 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:25.058 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.058 11:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:15:25.320 11:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:15:25.320 11:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:15:25.320 11:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:15:25.320 11:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:15:25.320 11:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:25.320 11:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:25.579 BaseBdev2 00:15:25.579 11:51:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:15:25.579 11:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:25.579 11:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:25.579 11:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:25.579 11:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:25.579 11:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:25.579 11:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:25.837 11:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:25.837 [ 00:15:25.837 { 00:15:25.837 "name": "BaseBdev2", 00:15:25.837 "aliases": [ 00:15:25.837 "13618242-c908-4914-a621-516a6dbcf847" 00:15:25.837 ], 00:15:25.837 "product_name": "Malloc disk", 00:15:25.837 "block_size": 512, 00:15:25.837 "num_blocks": 65536, 00:15:25.837 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:25.837 "assigned_rate_limits": { 00:15:25.837 "rw_ios_per_sec": 0, 00:15:25.837 "rw_mbytes_per_sec": 0, 00:15:25.837 "r_mbytes_per_sec": 0, 00:15:25.837 "w_mbytes_per_sec": 0 00:15:25.837 }, 00:15:25.837 "claimed": false, 00:15:25.837 "zoned": false, 00:15:25.837 "supported_io_types": { 00:15:25.837 "read": true, 00:15:25.837 "write": true, 00:15:25.837 "unmap": true, 00:15:25.837 "write_zeroes": true, 00:15:25.837 "flush": true, 00:15:25.837 "reset": true, 00:15:25.837 "compare": false, 00:15:25.837 "compare_and_write": false, 00:15:25.837 "abort": true, 00:15:25.837 "nvme_admin": false, 00:15:25.837 "nvme_io": false 
00:15:25.837 }, 00:15:25.837 "memory_domains": [ 00:15:25.837 { 00:15:25.837 "dma_device_id": "system", 00:15:25.837 "dma_device_type": 1 00:15:25.837 }, 00:15:25.837 { 00:15:25.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.837 "dma_device_type": 2 00:15:25.837 } 00:15:25.837 ], 00:15:25.837 "driver_specific": {} 00:15:25.837 } 00:15:25.837 ] 00:15:25.837 11:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:25.837 11:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:25.837 11:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:25.837 11:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:26.095 BaseBdev3 00:15:26.095 11:51:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:15:26.095 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:26.095 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:26.095 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:26.095 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:26.095 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:26.095 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:26.353 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 
2000 00:15:26.612 [ 00:15:26.612 { 00:15:26.612 "name": "BaseBdev3", 00:15:26.612 "aliases": [ 00:15:26.612 "70f3a552-054d-4ea2-8513-03ee054d8aae" 00:15:26.612 ], 00:15:26.612 "product_name": "Malloc disk", 00:15:26.612 "block_size": 512, 00:15:26.612 "num_blocks": 65536, 00:15:26.612 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:26.612 "assigned_rate_limits": { 00:15:26.612 "rw_ios_per_sec": 0, 00:15:26.612 "rw_mbytes_per_sec": 0, 00:15:26.612 "r_mbytes_per_sec": 0, 00:15:26.612 "w_mbytes_per_sec": 0 00:15:26.612 }, 00:15:26.612 "claimed": false, 00:15:26.612 "zoned": false, 00:15:26.612 "supported_io_types": { 00:15:26.612 "read": true, 00:15:26.612 "write": true, 00:15:26.612 "unmap": true, 00:15:26.612 "write_zeroes": true, 00:15:26.612 "flush": true, 00:15:26.612 "reset": true, 00:15:26.612 "compare": false, 00:15:26.612 "compare_and_write": false, 00:15:26.612 "abort": true, 00:15:26.612 "nvme_admin": false, 00:15:26.612 "nvme_io": false 00:15:26.612 }, 00:15:26.612 "memory_domains": [ 00:15:26.612 { 00:15:26.612 "dma_device_id": "system", 00:15:26.612 "dma_device_type": 1 00:15:26.612 }, 00:15:26.612 { 00:15:26.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.612 "dma_device_type": 2 00:15:26.612 } 00:15:26.612 ], 00:15:26.612 "driver_specific": {} 00:15:26.612 } 00:15:26.612 ] 00:15:26.612 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:26.612 11:51:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:26.612 11:51:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:26.612 11:51:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:26.871 BaseBdev4 00:15:26.871 11:51:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:15:26.871 
11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:15:26.871 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:26.871 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:26.871 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:26.871 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:26.871 11:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:27.131 11:51:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:27.390 [ 00:15:27.390 { 00:15:27.390 "name": "BaseBdev4", 00:15:27.390 "aliases": [ 00:15:27.390 "8a7e8467-e777-481d-87da-52f8e386d87d" 00:15:27.390 ], 00:15:27.390 "product_name": "Malloc disk", 00:15:27.390 "block_size": 512, 00:15:27.390 "num_blocks": 65536, 00:15:27.390 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:27.390 "assigned_rate_limits": { 00:15:27.390 "rw_ios_per_sec": 0, 00:15:27.390 "rw_mbytes_per_sec": 0, 00:15:27.390 "r_mbytes_per_sec": 0, 00:15:27.390 "w_mbytes_per_sec": 0 00:15:27.390 }, 00:15:27.390 "claimed": false, 00:15:27.390 "zoned": false, 00:15:27.390 "supported_io_types": { 00:15:27.390 "read": true, 00:15:27.390 "write": true, 00:15:27.390 "unmap": true, 00:15:27.390 "write_zeroes": true, 00:15:27.390 "flush": true, 00:15:27.390 "reset": true, 00:15:27.390 "compare": false, 00:15:27.390 "compare_and_write": false, 00:15:27.390 "abort": true, 00:15:27.390 "nvme_admin": false, 00:15:27.390 "nvme_io": false 00:15:27.390 }, 00:15:27.390 "memory_domains": [ 00:15:27.390 { 
00:15:27.390 "dma_device_id": "system", 00:15:27.390 "dma_device_type": 1 00:15:27.390 }, 00:15:27.390 { 00:15:27.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.390 "dma_device_type": 2 00:15:27.390 } 00:15:27.390 ], 00:15:27.390 "driver_specific": {} 00:15:27.390 } 00:15:27.390 ] 00:15:27.390 11:51:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:27.390 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:27.390 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:27.390 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:27.648 [2024-05-14 11:51:54.578586] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:27.648 [2024-05-14 11:51:54.578627] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:27.648 [2024-05-14 11:51:54.578647] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:27.648 [2024-05-14 11:51:54.580016] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:27.648 [2024-05-14 11:51:54.580058] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:27.648 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:27.648 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:27.648 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:27.648 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
raid_level=raid0 00:15:27.648 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:27.649 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:27.649 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:27.649 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:27.649 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:27.649 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:27.649 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.649 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:27.907 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:27.907 "name": "Existed_Raid", 00:15:27.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:27.907 "strip_size_kb": 64, 00:15:27.907 "state": "configuring", 00:15:27.907 "raid_level": "raid0", 00:15:27.907 "superblock": false, 00:15:27.907 "num_base_bdevs": 4, 00:15:27.907 "num_base_bdevs_discovered": 3, 00:15:27.907 "num_base_bdevs_operational": 4, 00:15:27.907 "base_bdevs_list": [ 00:15:27.907 { 00:15:27.907 "name": "BaseBdev1", 00:15:27.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:27.907 "is_configured": false, 00:15:27.907 "data_offset": 0, 00:15:27.907 "data_size": 0 00:15:27.907 }, 00:15:27.907 { 00:15:27.907 "name": "BaseBdev2", 00:15:27.907 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:27.907 "is_configured": true, 00:15:27.907 "data_offset": 0, 00:15:27.907 "data_size": 65536 00:15:27.907 }, 00:15:27.907 { 00:15:27.907 "name": 
"BaseBdev3", 00:15:27.907 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:27.907 "is_configured": true, 00:15:27.907 "data_offset": 0, 00:15:27.907 "data_size": 65536 00:15:27.907 }, 00:15:27.907 { 00:15:27.907 "name": "BaseBdev4", 00:15:27.907 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:27.907 "is_configured": true, 00:15:27.907 "data_offset": 0, 00:15:27.907 "data_size": 65536 00:15:27.907 } 00:15:27.907 ] 00:15:27.907 }' 00:15:27.907 11:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:27.907 11:51:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.474 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:28.733 [2024-05-14 11:51:55.601261] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # 
local num_base_bdevs_discovered 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.733 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.992 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:28.992 "name": "Existed_Raid", 00:15:28.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.992 "strip_size_kb": 64, 00:15:28.992 "state": "configuring", 00:15:28.992 "raid_level": "raid0", 00:15:28.992 "superblock": false, 00:15:28.992 "num_base_bdevs": 4, 00:15:28.992 "num_base_bdevs_discovered": 2, 00:15:28.992 "num_base_bdevs_operational": 4, 00:15:28.992 "base_bdevs_list": [ 00:15:28.992 { 00:15:28.992 "name": "BaseBdev1", 00:15:28.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.992 "is_configured": false, 00:15:28.992 "data_offset": 0, 00:15:28.992 "data_size": 0 00:15:28.992 }, 00:15:28.992 { 00:15:28.992 "name": null, 00:15:28.992 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:28.992 "is_configured": false, 00:15:28.992 "data_offset": 0, 00:15:28.992 "data_size": 65536 00:15:28.992 }, 00:15:28.992 { 00:15:28.992 "name": "BaseBdev3", 00:15:28.992 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:28.992 "is_configured": true, 00:15:28.992 "data_offset": 0, 00:15:28.992 "data_size": 65536 00:15:28.992 }, 00:15:28.992 { 00:15:28.992 "name": "BaseBdev4", 00:15:28.992 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:28.992 "is_configured": true, 00:15:28.992 "data_offset": 0, 00:15:28.992 "data_size": 65536 00:15:28.992 } 00:15:28.992 ] 00:15:28.992 }' 00:15:28.992 11:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:15:28.992 11:51:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.560 11:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.560 11:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:29.818 11:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:15:29.818 11:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:30.077 [2024-05-14 11:51:56.948258] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:30.077 BaseBdev1 00:15:30.077 11:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:15:30.077 11:51:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:15:30.077 11:51:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:30.077 11:51:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:30.077 11:51:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:30.077 11:51:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:30.077 11:51:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:30.335 11:51:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 
2000 00:15:30.593 [ 00:15:30.593 { 00:15:30.593 "name": "BaseBdev1", 00:15:30.593 "aliases": [ 00:15:30.593 "8d194732-0a07-4847-8e91-7f49ec784267" 00:15:30.593 ], 00:15:30.593 "product_name": "Malloc disk", 00:15:30.593 "block_size": 512, 00:15:30.593 "num_blocks": 65536, 00:15:30.593 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:30.593 "assigned_rate_limits": { 00:15:30.593 "rw_ios_per_sec": 0, 00:15:30.593 "rw_mbytes_per_sec": 0, 00:15:30.593 "r_mbytes_per_sec": 0, 00:15:30.593 "w_mbytes_per_sec": 0 00:15:30.593 }, 00:15:30.593 "claimed": true, 00:15:30.593 "claim_type": "exclusive_write", 00:15:30.593 "zoned": false, 00:15:30.593 "supported_io_types": { 00:15:30.593 "read": true, 00:15:30.593 "write": true, 00:15:30.593 "unmap": true, 00:15:30.593 "write_zeroes": true, 00:15:30.593 "flush": true, 00:15:30.593 "reset": true, 00:15:30.593 "compare": false, 00:15:30.593 "compare_and_write": false, 00:15:30.593 "abort": true, 00:15:30.593 "nvme_admin": false, 00:15:30.593 "nvme_io": false 00:15:30.593 }, 00:15:30.593 "memory_domains": [ 00:15:30.593 { 00:15:30.593 "dma_device_id": "system", 00:15:30.593 "dma_device_type": 1 00:15:30.593 }, 00:15:30.593 { 00:15:30.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.593 "dma_device_type": 2 00:15:30.593 } 00:15:30.593 ], 00:15:30.593 "driver_specific": {} 00:15:30.593 } 00:15:30.593 ] 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:30.593 11:51:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.593 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:30.593 "name": "Existed_Raid", 00:15:30.593 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.593 "strip_size_kb": 64, 00:15:30.593 "state": "configuring", 00:15:30.593 "raid_level": "raid0", 00:15:30.593 "superblock": false, 00:15:30.593 "num_base_bdevs": 4, 00:15:30.593 "num_base_bdevs_discovered": 3, 00:15:30.593 "num_base_bdevs_operational": 4, 00:15:30.593 "base_bdevs_list": [ 00:15:30.593 { 00:15:30.593 "name": "BaseBdev1", 00:15:30.593 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:30.593 "is_configured": true, 00:15:30.593 "data_offset": 0, 00:15:30.593 "data_size": 65536 00:15:30.593 }, 00:15:30.593 { 00:15:30.593 "name": null, 00:15:30.593 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:30.593 "is_configured": false, 00:15:30.593 "data_offset": 0, 00:15:30.593 "data_size": 65536 00:15:30.593 }, 00:15:30.593 { 00:15:30.593 "name": "BaseBdev3", 00:15:30.593 "uuid": 
"70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:30.593 "is_configured": true, 00:15:30.593 "data_offset": 0, 00:15:30.593 "data_size": 65536 00:15:30.593 }, 00:15:30.594 { 00:15:30.594 "name": "BaseBdev4", 00:15:30.594 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:30.594 "is_configured": true, 00:15:30.594 "data_offset": 0, 00:15:30.594 "data_size": 65536 00:15:30.594 } 00:15:30.594 ] 00:15:30.594 }' 00:15:30.594 11:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:30.594 11:51:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.160 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.160 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:31.418 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:15:31.418 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:31.676 [2024-05-14 11:51:58.644775] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:31.676 
11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.676 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.936 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:31.936 "name": "Existed_Raid", 00:15:31.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.936 "strip_size_kb": 64, 00:15:31.936 "state": "configuring", 00:15:31.936 "raid_level": "raid0", 00:15:31.936 "superblock": false, 00:15:31.936 "num_base_bdevs": 4, 00:15:31.936 "num_base_bdevs_discovered": 2, 00:15:31.936 "num_base_bdevs_operational": 4, 00:15:31.936 "base_bdevs_list": [ 00:15:31.936 { 00:15:31.936 "name": "BaseBdev1", 00:15:31.936 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:31.936 "is_configured": true, 00:15:31.936 "data_offset": 0, 00:15:31.936 "data_size": 65536 00:15:31.936 }, 00:15:31.936 { 00:15:31.936 "name": null, 00:15:31.936 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:31.936 "is_configured": false, 00:15:31.936 "data_offset": 0, 00:15:31.936 "data_size": 65536 00:15:31.936 }, 00:15:31.936 { 00:15:31.936 "name": null, 00:15:31.936 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:31.936 "is_configured": false, 00:15:31.936 "data_offset": 0, 
00:15:31.936 "data_size": 65536 00:15:31.936 }, 00:15:31.936 { 00:15:31.936 "name": "BaseBdev4", 00:15:31.936 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:31.936 "is_configured": true, 00:15:31.936 "data_offset": 0, 00:15:31.936 "data_size": 65536 00:15:31.936 } 00:15:31.936 ] 00:15:31.936 }' 00:15:31.936 11:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:31.936 11:51:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:32.504 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.504 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:32.763 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:15:32.763 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:33.023 [2024-05-14 11:51:59.924181] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=4 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.023 11:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.282 11:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:33.282 "name": "Existed_Raid", 00:15:33.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.282 "strip_size_kb": 64, 00:15:33.282 "state": "configuring", 00:15:33.282 "raid_level": "raid0", 00:15:33.282 "superblock": false, 00:15:33.282 "num_base_bdevs": 4, 00:15:33.282 "num_base_bdevs_discovered": 3, 00:15:33.282 "num_base_bdevs_operational": 4, 00:15:33.282 "base_bdevs_list": [ 00:15:33.282 { 00:15:33.282 "name": "BaseBdev1", 00:15:33.282 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:33.282 "is_configured": true, 00:15:33.282 "data_offset": 0, 00:15:33.282 "data_size": 65536 00:15:33.282 }, 00:15:33.282 { 00:15:33.282 "name": null, 00:15:33.282 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:33.282 "is_configured": false, 00:15:33.282 "data_offset": 0, 00:15:33.282 "data_size": 65536 00:15:33.282 }, 00:15:33.282 { 00:15:33.282 "name": "BaseBdev3", 00:15:33.282 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:33.282 "is_configured": true, 00:15:33.282 "data_offset": 0, 00:15:33.282 "data_size": 65536 00:15:33.282 }, 00:15:33.282 { 00:15:33.282 
"name": "BaseBdev4", 00:15:33.282 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:33.282 "is_configured": true, 00:15:33.282 "data_offset": 0, 00:15:33.282 "data_size": 65536 00:15:33.282 } 00:15:33.282 ] 00:15:33.282 }' 00:15:33.282 11:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:33.282 11:52:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.850 11:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.850 11:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:33.850 11:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:15:33.850 11:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:34.110 [2024-05-14 11:52:01.067239] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local raid_bdev_info 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.110 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.369 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:34.369 "name": "Existed_Raid", 00:15:34.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.369 "strip_size_kb": 64, 00:15:34.369 "state": "configuring", 00:15:34.369 "raid_level": "raid0", 00:15:34.369 "superblock": false, 00:15:34.369 "num_base_bdevs": 4, 00:15:34.369 "num_base_bdevs_discovered": 2, 00:15:34.369 "num_base_bdevs_operational": 4, 00:15:34.369 "base_bdevs_list": [ 00:15:34.369 { 00:15:34.369 "name": null, 00:15:34.369 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:34.369 "is_configured": false, 00:15:34.369 "data_offset": 0, 00:15:34.369 "data_size": 65536 00:15:34.369 }, 00:15:34.369 { 00:15:34.369 "name": null, 00:15:34.369 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:34.369 "is_configured": false, 00:15:34.369 "data_offset": 0, 00:15:34.369 "data_size": 65536 00:15:34.369 }, 00:15:34.369 { 00:15:34.369 "name": "BaseBdev3", 00:15:34.369 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:34.369 "is_configured": true, 00:15:34.369 "data_offset": 0, 00:15:34.369 "data_size": 65536 00:15:34.369 }, 00:15:34.369 { 00:15:34.369 "name": "BaseBdev4", 00:15:34.369 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:34.369 "is_configured": true, 
00:15:34.369 "data_offset": 0, 00:15:34.369 "data_size": 65536 00:15:34.369 } 00:15:34.369 ] 00:15:34.369 }' 00:15:34.369 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:34.369 11:52:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.937 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.937 11:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:35.195 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:15:35.195 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:35.484 [2024-05-14 11:52:02.399074] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.484 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:35.743 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:35.743 "name": "Existed_Raid", 00:15:35.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:35.743 "strip_size_kb": 64, 00:15:35.743 "state": "configuring", 00:15:35.743 "raid_level": "raid0", 00:15:35.743 "superblock": false, 00:15:35.743 "num_base_bdevs": 4, 00:15:35.743 "num_base_bdevs_discovered": 3, 00:15:35.743 "num_base_bdevs_operational": 4, 00:15:35.743 "base_bdevs_list": [ 00:15:35.743 { 00:15:35.743 "name": null, 00:15:35.743 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:35.743 "is_configured": false, 00:15:35.743 "data_offset": 0, 00:15:35.743 "data_size": 65536 00:15:35.743 }, 00:15:35.743 { 00:15:35.743 "name": "BaseBdev2", 00:15:35.743 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:35.743 "is_configured": true, 00:15:35.743 "data_offset": 0, 00:15:35.743 "data_size": 65536 00:15:35.743 }, 00:15:35.743 { 00:15:35.743 "name": "BaseBdev3", 00:15:35.743 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:35.743 "is_configured": true, 00:15:35.743 "data_offset": 0, 00:15:35.743 "data_size": 65536 00:15:35.743 }, 00:15:35.743 { 00:15:35.743 "name": "BaseBdev4", 00:15:35.743 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:35.743 "is_configured": true, 00:15:35.743 "data_offset": 0, 00:15:35.743 "data_size": 65536 00:15:35.743 } 
00:15:35.743 ] 00:15:35.743 }' 00:15:35.743 11:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:35.743 11:52:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.311 11:52:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.311 11:52:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:36.569 11:52:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:15:36.569 11:52:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.569 11:52:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:36.829 11:52:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8d194732-0a07-4847-8e91-7f49ec784267 00:15:37.088 [2024-05-14 11:52:03.950561] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:37.088 [2024-05-14 11:52:03.950601] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a4a060 00:15:37.088 [2024-05-14 11:52:03.950610] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:37.088 [2024-05-14 11:52:03.950806] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a51800 00:15:37.088 [2024-05-14 11:52:03.950928] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a4a060 00:15:37.088 [2024-05-14 11:52:03.950938] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0x1a4a060 00:15:37.088 [2024-05-14 11:52:03.951106] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:37.088 NewBaseBdev 00:15:37.088 11:52:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:15:37.088 11:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:15:37.088 11:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:37.088 11:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:37.088 11:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:37.088 11:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:37.088 11:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:37.347 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:37.347 [ 00:15:37.347 { 00:15:37.347 "name": "NewBaseBdev", 00:15:37.347 "aliases": [ 00:15:37.347 "8d194732-0a07-4847-8e91-7f49ec784267" 00:15:37.347 ], 00:15:37.347 "product_name": "Malloc disk", 00:15:37.347 "block_size": 512, 00:15:37.347 "num_blocks": 65536, 00:15:37.347 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:37.347 "assigned_rate_limits": { 00:15:37.347 "rw_ios_per_sec": 0, 00:15:37.347 "rw_mbytes_per_sec": 0, 00:15:37.347 "r_mbytes_per_sec": 0, 00:15:37.347 "w_mbytes_per_sec": 0 00:15:37.347 }, 00:15:37.347 "claimed": true, 00:15:37.347 "claim_type": "exclusive_write", 00:15:37.347 "zoned": false, 00:15:37.347 "supported_io_types": { 00:15:37.347 "read": true, 00:15:37.347 "write": true, 
00:15:37.347 "unmap": true, 00:15:37.347 "write_zeroes": true, 00:15:37.347 "flush": true, 00:15:37.347 "reset": true, 00:15:37.347 "compare": false, 00:15:37.347 "compare_and_write": false, 00:15:37.347 "abort": true, 00:15:37.347 "nvme_admin": false, 00:15:37.347 "nvme_io": false 00:15:37.347 }, 00:15:37.347 "memory_domains": [ 00:15:37.347 { 00:15:37.347 "dma_device_id": "system", 00:15:37.347 "dma_device_type": 1 00:15:37.347 }, 00:15:37.347 { 00:15:37.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.347 "dma_device_type": 2 00:15:37.347 } 00:15:37.347 ], 00:15:37.347 "driver_specific": {} 00:15:37.347 } 00:15:37.347 ] 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:37.606 "name": "Existed_Raid", 00:15:37.606 "uuid": "36dfa93b-6a70-4a44-8257-65a62d04ba8c", 00:15:37.606 "strip_size_kb": 64, 00:15:37.606 "state": "online", 00:15:37.606 "raid_level": "raid0", 00:15:37.606 "superblock": false, 00:15:37.606 "num_base_bdevs": 4, 00:15:37.606 "num_base_bdevs_discovered": 4, 00:15:37.606 "num_base_bdevs_operational": 4, 00:15:37.606 "base_bdevs_list": [ 00:15:37.606 { 00:15:37.606 "name": "NewBaseBdev", 00:15:37.606 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:37.606 "is_configured": true, 00:15:37.606 "data_offset": 0, 00:15:37.606 "data_size": 65536 00:15:37.606 }, 00:15:37.606 { 00:15:37.606 "name": "BaseBdev2", 00:15:37.606 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:37.606 "is_configured": true, 00:15:37.606 "data_offset": 0, 00:15:37.606 "data_size": 65536 00:15:37.606 }, 00:15:37.606 { 00:15:37.606 "name": "BaseBdev3", 00:15:37.606 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:37.606 "is_configured": true, 00:15:37.606 "data_offset": 0, 00:15:37.606 "data_size": 65536 00:15:37.606 }, 00:15:37.606 { 00:15:37.606 "name": "BaseBdev4", 00:15:37.606 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:37.606 "is_configured": true, 00:15:37.606 "data_offset": 0, 00:15:37.606 "data_size": 65536 00:15:37.606 } 00:15:37.606 ] 00:15:37.606 }' 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:37.606 11:52:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties 
Existed_Raid 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:38.544 [2024-05-14 11:52:05.418801] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:38.544 "name": "Existed_Raid", 00:15:38.544 "aliases": [ 00:15:38.544 "36dfa93b-6a70-4a44-8257-65a62d04ba8c" 00:15:38.544 ], 00:15:38.544 "product_name": "Raid Volume", 00:15:38.544 "block_size": 512, 00:15:38.544 "num_blocks": 262144, 00:15:38.544 "uuid": "36dfa93b-6a70-4a44-8257-65a62d04ba8c", 00:15:38.544 "assigned_rate_limits": { 00:15:38.544 "rw_ios_per_sec": 0, 00:15:38.544 "rw_mbytes_per_sec": 0, 00:15:38.544 "r_mbytes_per_sec": 0, 00:15:38.544 "w_mbytes_per_sec": 0 00:15:38.544 }, 00:15:38.544 "claimed": false, 00:15:38.544 "zoned": false, 00:15:38.544 "supported_io_types": { 00:15:38.544 "read": true, 00:15:38.544 "write": true, 00:15:38.544 "unmap": true, 00:15:38.544 "write_zeroes": true, 00:15:38.544 "flush": true, 00:15:38.544 "reset": true, 00:15:38.544 "compare": false, 00:15:38.544 "compare_and_write": false, 00:15:38.544 "abort": false, 00:15:38.544 "nvme_admin": false, 
00:15:38.544 "nvme_io": false 00:15:38.544 }, 00:15:38.544 "memory_domains": [ 00:15:38.544 { 00:15:38.544 "dma_device_id": "system", 00:15:38.544 "dma_device_type": 1 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.544 "dma_device_type": 2 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "dma_device_id": "system", 00:15:38.544 "dma_device_type": 1 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.544 "dma_device_type": 2 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "dma_device_id": "system", 00:15:38.544 "dma_device_type": 1 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.544 "dma_device_type": 2 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "dma_device_id": "system", 00:15:38.544 "dma_device_type": 1 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.544 "dma_device_type": 2 00:15:38.544 } 00:15:38.544 ], 00:15:38.544 "driver_specific": { 00:15:38.544 "raid": { 00:15:38.544 "uuid": "36dfa93b-6a70-4a44-8257-65a62d04ba8c", 00:15:38.544 "strip_size_kb": 64, 00:15:38.544 "state": "online", 00:15:38.544 "raid_level": "raid0", 00:15:38.544 "superblock": false, 00:15:38.544 "num_base_bdevs": 4, 00:15:38.544 "num_base_bdevs_discovered": 4, 00:15:38.544 "num_base_bdevs_operational": 4, 00:15:38.544 "base_bdevs_list": [ 00:15:38.544 { 00:15:38.544 "name": "NewBaseBdev", 00:15:38.544 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:38.544 "is_configured": true, 00:15:38.544 "data_offset": 0, 00:15:38.544 "data_size": 65536 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "name": "BaseBdev2", 00:15:38.544 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:38.544 "is_configured": true, 00:15:38.544 "data_offset": 0, 00:15:38.544 "data_size": 65536 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "name": "BaseBdev3", 00:15:38.544 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:38.544 "is_configured": 
true, 00:15:38.544 "data_offset": 0, 00:15:38.544 "data_size": 65536 00:15:38.544 }, 00:15:38.544 { 00:15:38.544 "name": "BaseBdev4", 00:15:38.544 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:38.544 "is_configured": true, 00:15:38.544 "data_offset": 0, 00:15:38.544 "data_size": 65536 00:15:38.544 } 00:15:38.544 ] 00:15:38.544 } 00:15:38.544 } 00:15:38.544 }' 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:15:38.544 BaseBdev2 00:15:38.544 BaseBdev3 00:15:38.544 BaseBdev4' 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:38.544 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:38.804 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:38.804 "name": "NewBaseBdev", 00:15:38.804 "aliases": [ 00:15:38.804 "8d194732-0a07-4847-8e91-7f49ec784267" 00:15:38.804 ], 00:15:38.804 "product_name": "Malloc disk", 00:15:38.804 "block_size": 512, 00:15:38.804 "num_blocks": 65536, 00:15:38.804 "uuid": "8d194732-0a07-4847-8e91-7f49ec784267", 00:15:38.804 "assigned_rate_limits": { 00:15:38.804 "rw_ios_per_sec": 0, 00:15:38.804 "rw_mbytes_per_sec": 0, 00:15:38.804 "r_mbytes_per_sec": 0, 00:15:38.804 "w_mbytes_per_sec": 0 00:15:38.804 }, 00:15:38.804 "claimed": true, 00:15:38.804 "claim_type": "exclusive_write", 00:15:38.804 "zoned": false, 00:15:38.804 "supported_io_types": { 00:15:38.804 "read": true, 00:15:38.804 "write": true, 00:15:38.804 "unmap": true, 00:15:38.804 
"write_zeroes": true, 00:15:38.804 "flush": true, 00:15:38.804 "reset": true, 00:15:38.804 "compare": false, 00:15:38.804 "compare_and_write": false, 00:15:38.804 "abort": true, 00:15:38.804 "nvme_admin": false, 00:15:38.804 "nvme_io": false 00:15:38.804 }, 00:15:38.804 "memory_domains": [ 00:15:38.804 { 00:15:38.804 "dma_device_id": "system", 00:15:38.804 "dma_device_type": 1 00:15:38.804 }, 00:15:38.804 { 00:15:38.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.804 "dma_device_type": 2 00:15:38.804 } 00:15:38.804 ], 00:15:38.804 "driver_specific": {} 00:15:38.804 }' 00:15:38.804 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:38.804 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:38.804 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:38.804 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:38.804 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:39.064 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.064 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:39.064 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:39.064 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.064 11:52:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:39.064 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:39.064 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:39.064 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:39.064 11:52:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:39.064 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:39.324 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:39.324 "name": "BaseBdev2", 00:15:39.324 "aliases": [ 00:15:39.324 "13618242-c908-4914-a621-516a6dbcf847" 00:15:39.324 ], 00:15:39.324 "product_name": "Malloc disk", 00:15:39.324 "block_size": 512, 00:15:39.324 "num_blocks": 65536, 00:15:39.324 "uuid": "13618242-c908-4914-a621-516a6dbcf847", 00:15:39.324 "assigned_rate_limits": { 00:15:39.324 "rw_ios_per_sec": 0, 00:15:39.324 "rw_mbytes_per_sec": 0, 00:15:39.324 "r_mbytes_per_sec": 0, 00:15:39.324 "w_mbytes_per_sec": 0 00:15:39.324 }, 00:15:39.324 "claimed": true, 00:15:39.324 "claim_type": "exclusive_write", 00:15:39.324 "zoned": false, 00:15:39.324 "supported_io_types": { 00:15:39.324 "read": true, 00:15:39.324 "write": true, 00:15:39.324 "unmap": true, 00:15:39.324 "write_zeroes": true, 00:15:39.324 "flush": true, 00:15:39.324 "reset": true, 00:15:39.324 "compare": false, 00:15:39.324 "compare_and_write": false, 00:15:39.324 "abort": true, 00:15:39.324 "nvme_admin": false, 00:15:39.324 "nvme_io": false 00:15:39.324 }, 00:15:39.324 "memory_domains": [ 00:15:39.324 { 00:15:39.324 "dma_device_id": "system", 00:15:39.324 "dma_device_type": 1 00:15:39.324 }, 00:15:39.324 { 00:15:39.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.324 "dma_device_type": 2 00:15:39.324 } 00:15:39.324 ], 00:15:39.324 "driver_specific": {} 00:15:39.324 }' 00:15:39.324 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:39.324 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:39.583 11:52:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:39.583 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:39.842 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:39.842 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:39.842 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:39.842 "name": "BaseBdev3", 00:15:39.842 "aliases": [ 00:15:39.842 "70f3a552-054d-4ea2-8513-03ee054d8aae" 00:15:39.842 ], 00:15:39.842 "product_name": "Malloc disk", 00:15:39.842 "block_size": 512, 00:15:39.842 "num_blocks": 65536, 00:15:39.842 "uuid": "70f3a552-054d-4ea2-8513-03ee054d8aae", 00:15:39.842 "assigned_rate_limits": { 00:15:39.842 "rw_ios_per_sec": 0, 00:15:39.842 "rw_mbytes_per_sec": 0, 00:15:39.842 "r_mbytes_per_sec": 0, 00:15:39.842 "w_mbytes_per_sec": 0 00:15:39.842 }, 00:15:39.842 "claimed": true, 00:15:39.842 "claim_type": "exclusive_write", 
00:15:39.842 "zoned": false, 00:15:39.842 "supported_io_types": { 00:15:39.842 "read": true, 00:15:39.842 "write": true, 00:15:39.842 "unmap": true, 00:15:39.842 "write_zeroes": true, 00:15:39.842 "flush": true, 00:15:39.842 "reset": true, 00:15:39.842 "compare": false, 00:15:39.842 "compare_and_write": false, 00:15:39.842 "abort": true, 00:15:39.842 "nvme_admin": false, 00:15:39.842 "nvme_io": false 00:15:39.842 }, 00:15:39.842 "memory_domains": [ 00:15:39.842 { 00:15:39.842 "dma_device_id": "system", 00:15:39.842 "dma_device_type": 1 00:15:39.842 }, 00:15:39.842 { 00:15:39.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.842 "dma_device_type": 2 00:15:39.842 } 00:15:39.842 ], 00:15:39.842 "driver_specific": {} 00:15:39.842 }' 00:15:39.842 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:40.102 11:52:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:40.102 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:40.102 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:40.102 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:40.102 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.102 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:40.102 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:40.102 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.102 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:40.361 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:40.361 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:40.361 11:52:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:40.361 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:40.361 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:40.620 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:40.620 "name": "BaseBdev4", 00:15:40.620 "aliases": [ 00:15:40.620 "8a7e8467-e777-481d-87da-52f8e386d87d" 00:15:40.620 ], 00:15:40.620 "product_name": "Malloc disk", 00:15:40.620 "block_size": 512, 00:15:40.620 "num_blocks": 65536, 00:15:40.620 "uuid": "8a7e8467-e777-481d-87da-52f8e386d87d", 00:15:40.620 "assigned_rate_limits": { 00:15:40.620 "rw_ios_per_sec": 0, 00:15:40.621 "rw_mbytes_per_sec": 0, 00:15:40.621 "r_mbytes_per_sec": 0, 00:15:40.621 "w_mbytes_per_sec": 0 00:15:40.621 }, 00:15:40.621 "claimed": true, 00:15:40.621 "claim_type": "exclusive_write", 00:15:40.621 "zoned": false, 00:15:40.621 "supported_io_types": { 00:15:40.621 "read": true, 00:15:40.621 "write": true, 00:15:40.621 "unmap": true, 00:15:40.621 "write_zeroes": true, 00:15:40.621 "flush": true, 00:15:40.621 "reset": true, 00:15:40.621 "compare": false, 00:15:40.621 "compare_and_write": false, 00:15:40.621 "abort": true, 00:15:40.621 "nvme_admin": false, 00:15:40.621 "nvme_io": false 00:15:40.621 }, 00:15:40.621 "memory_domains": [ 00:15:40.621 { 00:15:40.621 "dma_device_id": "system", 00:15:40.621 "dma_device_type": 1 00:15:40.621 }, 00:15:40.621 { 00:15:40.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.621 "dma_device_type": 2 00:15:40.621 } 00:15:40.621 ], 00:15:40.621 "driver_specific": {} 00:15:40.621 }' 00:15:40.621 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:40.621 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 
-- # jq .block_size 00:15:40.621 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:40.621 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:40.621 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:40.621 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.621 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:40.621 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:40.880 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.880 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:40.880 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:40.880 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:40.880 11:52:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:41.139 [2024-05-14 11:52:08.045476] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:41.139 [2024-05-14 11:52:08.045505] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:41.139 [2024-05-14 11:52:08.045566] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:41.139 [2024-05-14 11:52:08.045631] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:41.139 [2024-05-14 11:52:08.045643] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a4a060 name Existed_Raid, state offline 00:15:41.139 11:52:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@342 -- # killprocess 1712295 00:15:41.139 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 1712295 ']' 00:15:41.139 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 1712295 00:15:41.139 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:15:41.139 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:15:41.139 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1712295 00:15:41.139 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:15:41.139 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:15:41.140 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1712295' 00:15:41.140 killing process with pid 1712295 00:15:41.140 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 1712295 00:15:41.140 [2024-05-14 11:52:08.110469] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:41.140 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 1712295 00:15:41.140 [2024-05-14 11:52:08.152880] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:15:41.400 00:15:41.400 real 0m31.747s 00:15:41.400 user 0m58.383s 00:15:41.400 sys 0m5.505s 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.400 ************************************ 00:15:41.400 END TEST raid_state_function_test 00:15:41.400 
************************************ 00:15:41.400 11:52:08 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:15:41.400 11:52:08 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:15:41.400 11:52:08 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:15:41.400 11:52:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:41.400 ************************************ 00:15:41.400 START TEST raid_state_function_test_sb 00:15:41.400 ************************************ 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 4 true 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= 
num_base_bdevs )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:15:41.400 11:52:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=1717019 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1717019' 00:15:41.400 Process raid pid: 1717019 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 1717019 /var/tmp/spdk-raid.sock 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1717019 ']' 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:41.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:15:41.400 11:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:41.660 [2024-05-14 11:52:08.533345] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:15:41.660 [2024-05-14 11:52:08.533416] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:41.660 [2024-05-14 11:52:08.664045] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:41.919 [2024-05-14 11:52:08.771137] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:41.919 [2024-05-14 11:52:08.837595] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:41.919 [2024-05-14 11:52:08.837631] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:42.512 11:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:15:42.512 11:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:15:42.512 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:42.771 [2024-05-14 11:52:09.673001] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:42.771 [2024-05-14 11:52:09.673041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:42.771 [2024-05-14 11:52:09.673052] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:42.771 [2024-05-14 11:52:09.673064] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:42.771 [2024-05-14 11:52:09.673073] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:42.771 [2024-05-14 11:52:09.673084] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:42.771 
[2024-05-14 11:52:09.673093] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:42.771 [2024-05-14 11:52:09.673104] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.771 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:43.029 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:43.029 "name": "Existed_Raid", 00:15:43.029 "uuid": "ea1e8e85-6d50-4627-ac63-3a91daba0317", 00:15:43.029 
"strip_size_kb": 64, 00:15:43.029 "state": "configuring", 00:15:43.029 "raid_level": "raid0", 00:15:43.029 "superblock": true, 00:15:43.029 "num_base_bdevs": 4, 00:15:43.029 "num_base_bdevs_discovered": 0, 00:15:43.029 "num_base_bdevs_operational": 4, 00:15:43.029 "base_bdevs_list": [ 00:15:43.029 { 00:15:43.029 "name": "BaseBdev1", 00:15:43.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.029 "is_configured": false, 00:15:43.029 "data_offset": 0, 00:15:43.029 "data_size": 0 00:15:43.029 }, 00:15:43.029 { 00:15:43.029 "name": "BaseBdev2", 00:15:43.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.029 "is_configured": false, 00:15:43.029 "data_offset": 0, 00:15:43.029 "data_size": 0 00:15:43.029 }, 00:15:43.029 { 00:15:43.029 "name": "BaseBdev3", 00:15:43.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.029 "is_configured": false, 00:15:43.029 "data_offset": 0, 00:15:43.029 "data_size": 0 00:15:43.029 }, 00:15:43.029 { 00:15:43.029 "name": "BaseBdev4", 00:15:43.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.029 "is_configured": false, 00:15:43.029 "data_offset": 0, 00:15:43.029 "data_size": 0 00:15:43.029 } 00:15:43.029 ] 00:15:43.029 }' 00:15:43.029 11:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:43.029 11:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:43.593 11:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:43.850 [2024-05-14 11:52:10.731648] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:43.850 [2024-05-14 11:52:10.731679] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19e8720 name Existed_Raid, state configuring 00:15:43.850 11:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:44.108 [2024-05-14 11:52:10.976328] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:44.108 [2024-05-14 11:52:10.976360] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:44.108 [2024-05-14 11:52:10.976371] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:44.108 [2024-05-14 11:52:10.976382] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:44.108 [2024-05-14 11:52:10.976391] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:44.108 [2024-05-14 11:52:10.976408] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:44.108 [2024-05-14 11:52:10.976417] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:44.108 [2024-05-14 11:52:10.976428] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:44.108 11:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:44.367 [2024-05-14 11:52:11.230916] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:44.367 BaseBdev1 00:15:44.367 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:15:44.367 11:52:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:15:44.367 11:52:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:44.367 11:52:11 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:44.367 11:52:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:44.367 11:52:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:44.367 11:52:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:44.627 [ 00:15:44.627 { 00:15:44.627 "name": "BaseBdev1", 00:15:44.627 "aliases": [ 00:15:44.627 "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521" 00:15:44.627 ], 00:15:44.627 "product_name": "Malloc disk", 00:15:44.627 "block_size": 512, 00:15:44.627 "num_blocks": 65536, 00:15:44.627 "uuid": "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521", 00:15:44.627 "assigned_rate_limits": { 00:15:44.627 "rw_ios_per_sec": 0, 00:15:44.627 "rw_mbytes_per_sec": 0, 00:15:44.627 "r_mbytes_per_sec": 0, 00:15:44.627 "w_mbytes_per_sec": 0 00:15:44.627 }, 00:15:44.627 "claimed": true, 00:15:44.627 "claim_type": "exclusive_write", 00:15:44.627 "zoned": false, 00:15:44.627 "supported_io_types": { 00:15:44.627 "read": true, 00:15:44.627 "write": true, 00:15:44.627 "unmap": true, 00:15:44.627 "write_zeroes": true, 00:15:44.627 "flush": true, 00:15:44.627 "reset": true, 00:15:44.627 "compare": false, 00:15:44.627 "compare_and_write": false, 00:15:44.627 "abort": true, 00:15:44.627 "nvme_admin": false, 00:15:44.627 "nvme_io": false 00:15:44.627 }, 00:15:44.627 "memory_domains": [ 00:15:44.627 { 00:15:44.627 "dma_device_id": "system", 00:15:44.627 "dma_device_type": 1 00:15:44.627 }, 00:15:44.627 { 00:15:44.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.627 
"dma_device_type": 2 00:15:44.627 } 00:15:44.627 ], 00:15:44.627 "driver_specific": {} 00:15:44.627 } 00:15:44.627 ] 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.627 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.886 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:44.886 "name": "Existed_Raid", 00:15:44.886 "uuid": "53a66bd8-6fb6-4778-b312-83632cefb3b6", 00:15:44.886 "strip_size_kb": 64, 
00:15:44.886 "state": "configuring", 00:15:44.886 "raid_level": "raid0", 00:15:44.886 "superblock": true, 00:15:44.886 "num_base_bdevs": 4, 00:15:44.886 "num_base_bdevs_discovered": 1, 00:15:44.886 "num_base_bdevs_operational": 4, 00:15:44.886 "base_bdevs_list": [ 00:15:44.886 { 00:15:44.886 "name": "BaseBdev1", 00:15:44.886 "uuid": "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521", 00:15:44.886 "is_configured": true, 00:15:44.886 "data_offset": 2048, 00:15:44.886 "data_size": 63488 00:15:44.886 }, 00:15:44.886 { 00:15:44.886 "name": "BaseBdev2", 00:15:44.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.886 "is_configured": false, 00:15:44.886 "data_offset": 0, 00:15:44.886 "data_size": 0 00:15:44.886 }, 00:15:44.886 { 00:15:44.886 "name": "BaseBdev3", 00:15:44.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.886 "is_configured": false, 00:15:44.886 "data_offset": 0, 00:15:44.886 "data_size": 0 00:15:44.886 }, 00:15:44.886 { 00:15:44.886 "name": "BaseBdev4", 00:15:44.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.886 "is_configured": false, 00:15:44.886 "data_offset": 0, 00:15:44.886 "data_size": 0 00:15:44.886 } 00:15:44.886 ] 00:15:44.886 }' 00:15:44.886 11:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:44.886 11:52:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:45.454 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:45.713 [2024-05-14 11:52:12.678747] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:45.713 [2024-05-14 11:52:12.678782] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19e7fb0 name Existed_Raid, state configuring 00:15:45.713 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:45.973 [2024-05-14 11:52:12.923450] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:45.973 [2024-05-14 11:52:12.924975] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:45.973 [2024-05-14 11:52:12.925009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:45.973 [2024-05-14 11:52:12.925020] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:45.973 [2024-05-14 11:52:12.925031] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:45.973 [2024-05-14 11:52:12.925040] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:45.973 [2024-05-14 11:52:12.925051] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.973 11:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:46.232 11:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:46.232 "name": "Existed_Raid", 00:15:46.232 "uuid": "a2f507ff-9101-4a13-a34c-4b6329b82d7e", 00:15:46.232 "strip_size_kb": 64, 00:15:46.232 "state": "configuring", 00:15:46.232 "raid_level": "raid0", 00:15:46.232 "superblock": true, 00:15:46.232 "num_base_bdevs": 4, 00:15:46.232 "num_base_bdevs_discovered": 1, 00:15:46.232 "num_base_bdevs_operational": 4, 00:15:46.232 "base_bdevs_list": [ 00:15:46.232 { 00:15:46.232 "name": "BaseBdev1", 00:15:46.232 "uuid": "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521", 00:15:46.232 "is_configured": true, 00:15:46.232 "data_offset": 2048, 00:15:46.232 "data_size": 63488 00:15:46.232 }, 00:15:46.232 { 00:15:46.232 "name": "BaseBdev2", 00:15:46.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.232 "is_configured": false, 00:15:46.232 "data_offset": 0, 00:15:46.232 "data_size": 0 00:15:46.232 }, 00:15:46.232 { 00:15:46.232 "name": "BaseBdev3", 00:15:46.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.232 "is_configured": false, 00:15:46.232 "data_offset": 0, 00:15:46.232 
"data_size": 0 00:15:46.232 }, 00:15:46.232 { 00:15:46.232 "name": "BaseBdev4", 00:15:46.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.232 "is_configured": false, 00:15:46.232 "data_offset": 0, 00:15:46.232 "data_size": 0 00:15:46.232 } 00:15:46.232 ] 00:15:46.232 }' 00:15:46.232 11:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:46.232 11:52:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:46.801 11:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:47.061 [2024-05-14 11:52:14.029651] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:47.061 BaseBdev2 00:15:47.061 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:15:47.061 11:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:47.061 11:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:47.061 11:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:47.061 11:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:47.061 11:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:47.061 11:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.320 11:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:47.580 [ 
00:15:47.580 { 00:15:47.580 "name": "BaseBdev2", 00:15:47.580 "aliases": [ 00:15:47.580 "f17475b5-0cca-4f96-b055-410eb3d4a606" 00:15:47.580 ], 00:15:47.580 "product_name": "Malloc disk", 00:15:47.580 "block_size": 512, 00:15:47.580 "num_blocks": 65536, 00:15:47.580 "uuid": "f17475b5-0cca-4f96-b055-410eb3d4a606", 00:15:47.580 "assigned_rate_limits": { 00:15:47.580 "rw_ios_per_sec": 0, 00:15:47.580 "rw_mbytes_per_sec": 0, 00:15:47.580 "r_mbytes_per_sec": 0, 00:15:47.580 "w_mbytes_per_sec": 0 00:15:47.580 }, 00:15:47.580 "claimed": true, 00:15:47.580 "claim_type": "exclusive_write", 00:15:47.580 "zoned": false, 00:15:47.580 "supported_io_types": { 00:15:47.580 "read": true, 00:15:47.580 "write": true, 00:15:47.580 "unmap": true, 00:15:47.580 "write_zeroes": true, 00:15:47.580 "flush": true, 00:15:47.580 "reset": true, 00:15:47.580 "compare": false, 00:15:47.580 "compare_and_write": false, 00:15:47.580 "abort": true, 00:15:47.580 "nvme_admin": false, 00:15:47.580 "nvme_io": false 00:15:47.580 }, 00:15:47.580 "memory_domains": [ 00:15:47.580 { 00:15:47.580 "dma_device_id": "system", 00:15:47.580 "dma_device_type": 1 00:15:47.580 }, 00:15:47.580 { 00:15:47.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.580 "dma_device_type": 2 00:15:47.580 } 00:15:47.580 ], 00:15:47.580 "driver_specific": {} 00:15:47.580 } 00:15:47.580 ] 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.580 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.840 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:47.840 "name": "Existed_Raid", 00:15:47.840 "uuid": "a2f507ff-9101-4a13-a34c-4b6329b82d7e", 00:15:47.840 "strip_size_kb": 64, 00:15:47.840 "state": "configuring", 00:15:47.840 "raid_level": "raid0", 00:15:47.840 "superblock": true, 00:15:47.840 "num_base_bdevs": 4, 00:15:47.840 "num_base_bdevs_discovered": 2, 00:15:47.840 "num_base_bdevs_operational": 4, 00:15:47.840 "base_bdevs_list": [ 00:15:47.840 { 00:15:47.840 "name": "BaseBdev1", 00:15:47.840 "uuid": "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521", 00:15:47.840 "is_configured": true, 00:15:47.840 "data_offset": 2048, 00:15:47.840 "data_size": 63488 00:15:47.840 }, 00:15:47.840 { 00:15:47.840 "name": "BaseBdev2", 00:15:47.840 "uuid": 
"f17475b5-0cca-4f96-b055-410eb3d4a606", 00:15:47.840 "is_configured": true, 00:15:47.840 "data_offset": 2048, 00:15:47.840 "data_size": 63488 00:15:47.840 }, 00:15:47.840 { 00:15:47.840 "name": "BaseBdev3", 00:15:47.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.840 "is_configured": false, 00:15:47.840 "data_offset": 0, 00:15:47.840 "data_size": 0 00:15:47.840 }, 00:15:47.840 { 00:15:47.841 "name": "BaseBdev4", 00:15:47.841 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.841 "is_configured": false, 00:15:47.841 "data_offset": 0, 00:15:47.841 "data_size": 0 00:15:47.841 } 00:15:47.841 ] 00:15:47.841 }' 00:15:47.841 11:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:47.841 11:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:48.408 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:48.667 [2024-05-14 11:52:15.540962] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:48.667 BaseBdev3 00:15:48.667 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:15:48.667 11:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:48.667 11:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:48.667 11:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:48.667 11:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:48.667 11:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:48.667 11:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:48.929 [ 00:15:48.929 { 00:15:48.929 "name": "BaseBdev3", 00:15:48.929 "aliases": [ 00:15:48.929 "23762620-6abb-47cd-be89-34382046be45" 00:15:48.929 ], 00:15:48.929 "product_name": "Malloc disk", 00:15:48.929 "block_size": 512, 00:15:48.929 "num_blocks": 65536, 00:15:48.929 "uuid": "23762620-6abb-47cd-be89-34382046be45", 00:15:48.929 "assigned_rate_limits": { 00:15:48.929 "rw_ios_per_sec": 0, 00:15:48.929 "rw_mbytes_per_sec": 0, 00:15:48.929 "r_mbytes_per_sec": 0, 00:15:48.929 "w_mbytes_per_sec": 0 00:15:48.929 }, 00:15:48.929 "claimed": true, 00:15:48.929 "claim_type": "exclusive_write", 00:15:48.929 "zoned": false, 00:15:48.929 "supported_io_types": { 00:15:48.929 "read": true, 00:15:48.929 "write": true, 00:15:48.929 "unmap": true, 00:15:48.929 "write_zeroes": true, 00:15:48.929 "flush": true, 00:15:48.929 "reset": true, 00:15:48.929 "compare": false, 00:15:48.929 "compare_and_write": false, 00:15:48.929 "abort": true, 00:15:48.929 "nvme_admin": false, 00:15:48.929 "nvme_io": false 00:15:48.929 }, 00:15:48.929 "memory_domains": [ 00:15:48.929 { 00:15:48.929 "dma_device_id": "system", 00:15:48.929 "dma_device_type": 1 00:15:48.929 }, 00:15:48.929 { 00:15:48.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.929 "dma_device_type": 2 00:15:48.929 } 00:15:48.929 ], 00:15:48.929 "driver_specific": {} 00:15:48.929 } 00:15:48.929 ] 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < 
num_base_bdevs )) 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.929 11:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:49.225 11:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:49.225 "name": "Existed_Raid", 00:15:49.225 "uuid": "a2f507ff-9101-4a13-a34c-4b6329b82d7e", 00:15:49.225 "strip_size_kb": 64, 00:15:49.225 "state": "configuring", 00:15:49.225 "raid_level": "raid0", 00:15:49.225 "superblock": true, 00:15:49.225 "num_base_bdevs": 4, 00:15:49.225 "num_base_bdevs_discovered": 3, 00:15:49.225 
"num_base_bdevs_operational": 4, 00:15:49.225 "base_bdevs_list": [ 00:15:49.225 { 00:15:49.225 "name": "BaseBdev1", 00:15:49.225 "uuid": "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521", 00:15:49.225 "is_configured": true, 00:15:49.225 "data_offset": 2048, 00:15:49.225 "data_size": 63488 00:15:49.225 }, 00:15:49.225 { 00:15:49.225 "name": "BaseBdev2", 00:15:49.225 "uuid": "f17475b5-0cca-4f96-b055-410eb3d4a606", 00:15:49.225 "is_configured": true, 00:15:49.225 "data_offset": 2048, 00:15:49.225 "data_size": 63488 00:15:49.225 }, 00:15:49.225 { 00:15:49.225 "name": "BaseBdev3", 00:15:49.225 "uuid": "23762620-6abb-47cd-be89-34382046be45", 00:15:49.225 "is_configured": true, 00:15:49.225 "data_offset": 2048, 00:15:49.225 "data_size": 63488 00:15:49.225 }, 00:15:49.225 { 00:15:49.225 "name": "BaseBdev4", 00:15:49.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.225 "is_configured": false, 00:15:49.225 "data_offset": 0, 00:15:49.225 "data_size": 0 00:15:49.225 } 00:15:49.225 ] 00:15:49.225 }' 00:15:49.225 11:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:49.225 11:52:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:49.808 11:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:50.068 [2024-05-14 11:52:16.928099] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:50.068 [2024-05-14 11:52:16.928266] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x19e91b0 00:15:50.068 [2024-05-14 11:52:16.928285] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:50.068 [2024-05-14 11:52:16.928476] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ea860 00:15:50.068 [2024-05-14 11:52:16.928606] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19e91b0 00:15:50.068 [2024-05-14 11:52:16.928616] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19e91b0 00:15:50.068 [2024-05-14 11:52:16.928711] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:50.068 BaseBdev4 00:15:50.068 11:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:15:50.068 11:52:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:15:50.068 11:52:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:50.068 11:52:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:50.068 11:52:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:50.068 11:52:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:50.068 11:52:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:50.068 11:52:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:50.327 [ 00:15:50.327 { 00:15:50.327 "name": "BaseBdev4", 00:15:50.327 "aliases": [ 00:15:50.327 "40bcd81d-7738-4a40-b655-cdada4692d82" 00:15:50.327 ], 00:15:50.327 "product_name": "Malloc disk", 00:15:50.327 "block_size": 512, 00:15:50.327 "num_blocks": 65536, 00:15:50.327 "uuid": "40bcd81d-7738-4a40-b655-cdada4692d82", 00:15:50.327 "assigned_rate_limits": { 00:15:50.327 "rw_ios_per_sec": 0, 00:15:50.327 "rw_mbytes_per_sec": 0, 00:15:50.327 "r_mbytes_per_sec": 0, 00:15:50.327 
"w_mbytes_per_sec": 0 00:15:50.327 }, 00:15:50.327 "claimed": true, 00:15:50.327 "claim_type": "exclusive_write", 00:15:50.327 "zoned": false, 00:15:50.327 "supported_io_types": { 00:15:50.327 "read": true, 00:15:50.327 "write": true, 00:15:50.327 "unmap": true, 00:15:50.327 "write_zeroes": true, 00:15:50.327 "flush": true, 00:15:50.327 "reset": true, 00:15:50.327 "compare": false, 00:15:50.327 "compare_and_write": false, 00:15:50.327 "abort": true, 00:15:50.327 "nvme_admin": false, 00:15:50.327 "nvme_io": false 00:15:50.327 }, 00:15:50.327 "memory_domains": [ 00:15:50.327 { 00:15:50.327 "dma_device_id": "system", 00:15:50.327 "dma_device_type": 1 00:15:50.327 }, 00:15:50.327 { 00:15:50.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.327 "dma_device_type": 2 00:15:50.327 } 00:15:50.327 ], 00:15:50.327 "driver_specific": {} 00:15:50.327 } 00:15:50.327 ] 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.327 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.587 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:50.587 "name": "Existed_Raid", 00:15:50.587 "uuid": "a2f507ff-9101-4a13-a34c-4b6329b82d7e", 00:15:50.587 "strip_size_kb": 64, 00:15:50.587 "state": "online", 00:15:50.587 "raid_level": "raid0", 00:15:50.587 "superblock": true, 00:15:50.587 "num_base_bdevs": 4, 00:15:50.587 "num_base_bdevs_discovered": 4, 00:15:50.587 "num_base_bdevs_operational": 4, 00:15:50.587 "base_bdevs_list": [ 00:15:50.587 { 00:15:50.587 "name": "BaseBdev1", 00:15:50.587 "uuid": "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521", 00:15:50.587 "is_configured": true, 00:15:50.587 "data_offset": 2048, 00:15:50.587 "data_size": 63488 00:15:50.587 }, 00:15:50.587 { 00:15:50.587 "name": "BaseBdev2", 00:15:50.587 "uuid": "f17475b5-0cca-4f96-b055-410eb3d4a606", 00:15:50.587 "is_configured": true, 00:15:50.587 "data_offset": 2048, 00:15:50.587 "data_size": 63488 00:15:50.587 }, 00:15:50.587 { 00:15:50.587 "name": "BaseBdev3", 00:15:50.587 "uuid": "23762620-6abb-47cd-be89-34382046be45", 00:15:50.587 "is_configured": true, 00:15:50.587 "data_offset": 2048, 00:15:50.587 "data_size": 63488 00:15:50.587 }, 00:15:50.587 { 00:15:50.587 "name": "BaseBdev4", 00:15:50.587 "uuid": 
"40bcd81d-7738-4a40-b655-cdada4692d82", 00:15:50.587 "is_configured": true, 00:15:50.587 "data_offset": 2048, 00:15:50.587 "data_size": 63488 00:15:50.587 } 00:15:50.587 ] 00:15:50.587 }' 00:15:50.587 11:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:50.587 11:52:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:51.156 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:15:51.156 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:15:51.156 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:51.156 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:51.156 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:51.156 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:15:51.156 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:51.156 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:51.415 [2024-05-14 11:52:18.308044] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:51.415 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:51.415 "name": "Existed_Raid", 00:15:51.415 "aliases": [ 00:15:51.415 "a2f507ff-9101-4a13-a34c-4b6329b82d7e" 00:15:51.415 ], 00:15:51.415 "product_name": "Raid Volume", 00:15:51.415 "block_size": 512, 00:15:51.415 "num_blocks": 253952, 00:15:51.415 "uuid": "a2f507ff-9101-4a13-a34c-4b6329b82d7e", 00:15:51.415 "assigned_rate_limits": { 00:15:51.415 "rw_ios_per_sec": 
0, 00:15:51.415 "rw_mbytes_per_sec": 0, 00:15:51.415 "r_mbytes_per_sec": 0, 00:15:51.415 "w_mbytes_per_sec": 0 00:15:51.415 }, 00:15:51.415 "claimed": false, 00:15:51.415 "zoned": false, 00:15:51.415 "supported_io_types": { 00:15:51.415 "read": true, 00:15:51.415 "write": true, 00:15:51.415 "unmap": true, 00:15:51.415 "write_zeroes": true, 00:15:51.415 "flush": true, 00:15:51.415 "reset": true, 00:15:51.415 "compare": false, 00:15:51.415 "compare_and_write": false, 00:15:51.415 "abort": false, 00:15:51.415 "nvme_admin": false, 00:15:51.415 "nvme_io": false 00:15:51.415 }, 00:15:51.415 "memory_domains": [ 00:15:51.415 { 00:15:51.415 "dma_device_id": "system", 00:15:51.415 "dma_device_type": 1 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.415 "dma_device_type": 2 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "dma_device_id": "system", 00:15:51.415 "dma_device_type": 1 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.415 "dma_device_type": 2 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "dma_device_id": "system", 00:15:51.415 "dma_device_type": 1 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.415 "dma_device_type": 2 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "dma_device_id": "system", 00:15:51.415 "dma_device_type": 1 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.415 "dma_device_type": 2 00:15:51.415 } 00:15:51.415 ], 00:15:51.415 "driver_specific": { 00:15:51.415 "raid": { 00:15:51.415 "uuid": "a2f507ff-9101-4a13-a34c-4b6329b82d7e", 00:15:51.415 "strip_size_kb": 64, 00:15:51.415 "state": "online", 00:15:51.415 "raid_level": "raid0", 00:15:51.415 "superblock": true, 00:15:51.415 "num_base_bdevs": 4, 00:15:51.415 "num_base_bdevs_discovered": 4, 00:15:51.415 "num_base_bdevs_operational": 4, 00:15:51.415 "base_bdevs_list": [ 00:15:51.415 { 00:15:51.415 "name": "BaseBdev1", 
00:15:51.415 "uuid": "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521", 00:15:51.415 "is_configured": true, 00:15:51.415 "data_offset": 2048, 00:15:51.415 "data_size": 63488 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "name": "BaseBdev2", 00:15:51.415 "uuid": "f17475b5-0cca-4f96-b055-410eb3d4a606", 00:15:51.415 "is_configured": true, 00:15:51.415 "data_offset": 2048, 00:15:51.415 "data_size": 63488 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "name": "BaseBdev3", 00:15:51.415 "uuid": "23762620-6abb-47cd-be89-34382046be45", 00:15:51.415 "is_configured": true, 00:15:51.415 "data_offset": 2048, 00:15:51.415 "data_size": 63488 00:15:51.415 }, 00:15:51.415 { 00:15:51.415 "name": "BaseBdev4", 00:15:51.415 "uuid": "40bcd81d-7738-4a40-b655-cdada4692d82", 00:15:51.415 "is_configured": true, 00:15:51.415 "data_offset": 2048, 00:15:51.415 "data_size": 63488 00:15:51.415 } 00:15:51.415 ] 00:15:51.415 } 00:15:51.415 } 00:15:51.415 }' 00:15:51.415 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:51.415 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:15:51.415 BaseBdev2 00:15:51.415 BaseBdev3 00:15:51.415 BaseBdev4' 00:15:51.415 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:51.415 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:51.415 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:51.674 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:51.674 "name": "BaseBdev1", 00:15:51.674 "aliases": [ 00:15:51.674 "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521" 00:15:51.674 ], 00:15:51.674 "product_name": "Malloc disk", 
00:15:51.674 "block_size": 512, 00:15:51.674 "num_blocks": 65536, 00:15:51.674 "uuid": "b00fba7b-9d3c-43ce-a8d4-ebbba6f81521", 00:15:51.674 "assigned_rate_limits": { 00:15:51.674 "rw_ios_per_sec": 0, 00:15:51.674 "rw_mbytes_per_sec": 0, 00:15:51.674 "r_mbytes_per_sec": 0, 00:15:51.674 "w_mbytes_per_sec": 0 00:15:51.674 }, 00:15:51.674 "claimed": true, 00:15:51.674 "claim_type": "exclusive_write", 00:15:51.674 "zoned": false, 00:15:51.674 "supported_io_types": { 00:15:51.674 "read": true, 00:15:51.674 "write": true, 00:15:51.674 "unmap": true, 00:15:51.674 "write_zeroes": true, 00:15:51.674 "flush": true, 00:15:51.674 "reset": true, 00:15:51.674 "compare": false, 00:15:51.674 "compare_and_write": false, 00:15:51.674 "abort": true, 00:15:51.674 "nvme_admin": false, 00:15:51.674 "nvme_io": false 00:15:51.674 }, 00:15:51.674 "memory_domains": [ 00:15:51.674 { 00:15:51.674 "dma_device_id": "system", 00:15:51.674 "dma_device_type": 1 00:15:51.674 }, 00:15:51.674 { 00:15:51.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.674 "dma_device_type": 2 00:15:51.674 } 00:15:51.674 ], 00:15:51.674 "driver_specific": {} 00:15:51.674 }' 00:15:51.674 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:51.674 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:51.674 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:51.674 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:51.674 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 
00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:51.937 11:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:52.196 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:52.196 "name": "BaseBdev2", 00:15:52.196 "aliases": [ 00:15:52.196 "f17475b5-0cca-4f96-b055-410eb3d4a606" 00:15:52.196 ], 00:15:52.196 "product_name": "Malloc disk", 00:15:52.196 "block_size": 512, 00:15:52.196 "num_blocks": 65536, 00:15:52.196 "uuid": "f17475b5-0cca-4f96-b055-410eb3d4a606", 00:15:52.196 "assigned_rate_limits": { 00:15:52.196 "rw_ios_per_sec": 0, 00:15:52.196 "rw_mbytes_per_sec": 0, 00:15:52.196 "r_mbytes_per_sec": 0, 00:15:52.196 "w_mbytes_per_sec": 0 00:15:52.196 }, 00:15:52.196 "claimed": true, 00:15:52.196 "claim_type": "exclusive_write", 00:15:52.196 "zoned": false, 00:15:52.196 "supported_io_types": { 00:15:52.196 "read": true, 00:15:52.196 "write": true, 00:15:52.196 "unmap": true, 00:15:52.197 "write_zeroes": true, 00:15:52.197 "flush": true, 00:15:52.197 "reset": true, 00:15:52.197 "compare": false, 00:15:52.197 "compare_and_write": false, 00:15:52.197 "abort": true, 00:15:52.197 "nvme_admin": false, 00:15:52.197 "nvme_io": false 00:15:52.197 }, 00:15:52.197 "memory_domains": [ 00:15:52.197 { 
00:15:52.197 "dma_device_id": "system", 00:15:52.197 "dma_device_type": 1 00:15:52.197 }, 00:15:52.197 { 00:15:52.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.197 "dma_device_type": 2 00:15:52.197 } 00:15:52.197 ], 00:15:52.197 "driver_specific": {} 00:15:52.197 }' 00:15:52.197 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:52.197 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:52.456 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:52.456 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:52.456 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:52.456 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:52.456 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:52.456 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:52.456 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.456 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:52.456 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:52.715 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:52.715 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:52.715 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:52.715 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:52.974 11:52:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:52.974 "name": "BaseBdev3", 00:15:52.974 "aliases": [ 00:15:52.974 "23762620-6abb-47cd-be89-34382046be45" 00:15:52.974 ], 00:15:52.974 "product_name": "Malloc disk", 00:15:52.974 "block_size": 512, 00:15:52.974 "num_blocks": 65536, 00:15:52.974 "uuid": "23762620-6abb-47cd-be89-34382046be45", 00:15:52.974 "assigned_rate_limits": { 00:15:52.974 "rw_ios_per_sec": 0, 00:15:52.974 "rw_mbytes_per_sec": 0, 00:15:52.974 "r_mbytes_per_sec": 0, 00:15:52.974 "w_mbytes_per_sec": 0 00:15:52.974 }, 00:15:52.974 "claimed": true, 00:15:52.974 "claim_type": "exclusive_write", 00:15:52.974 "zoned": false, 00:15:52.974 "supported_io_types": { 00:15:52.974 "read": true, 00:15:52.974 "write": true, 00:15:52.974 "unmap": true, 00:15:52.974 "write_zeroes": true, 00:15:52.974 "flush": true, 00:15:52.974 "reset": true, 00:15:52.974 "compare": false, 00:15:52.974 "compare_and_write": false, 00:15:52.974 "abort": true, 00:15:52.974 "nvme_admin": false, 00:15:52.974 "nvme_io": false 00:15:52.974 }, 00:15:52.974 "memory_domains": [ 00:15:52.974 { 00:15:52.975 "dma_device_id": "system", 00:15:52.975 "dma_device_type": 1 00:15:52.975 }, 00:15:52.975 { 00:15:52.975 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.975 "dma_device_type": 2 00:15:52.975 } 00:15:52.975 ], 00:15:52.975 "driver_specific": {} 00:15:52.975 }' 00:15:52.975 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:52.975 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:52.975 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:52.975 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:52.975 11:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:52.975 11:52:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:52.975 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:52.975 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:53.234 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:53.234 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:53.234 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:53.234 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:53.234 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:53.234 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:53.234 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:53.493 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:53.493 "name": "BaseBdev4", 00:15:53.493 "aliases": [ 00:15:53.493 "40bcd81d-7738-4a40-b655-cdada4692d82" 00:15:53.493 ], 00:15:53.493 "product_name": "Malloc disk", 00:15:53.493 "block_size": 512, 00:15:53.493 "num_blocks": 65536, 00:15:53.493 "uuid": "40bcd81d-7738-4a40-b655-cdada4692d82", 00:15:53.493 "assigned_rate_limits": { 00:15:53.493 "rw_ios_per_sec": 0, 00:15:53.493 "rw_mbytes_per_sec": 0, 00:15:53.493 "r_mbytes_per_sec": 0, 00:15:53.493 "w_mbytes_per_sec": 0 00:15:53.493 }, 00:15:53.493 "claimed": true, 00:15:53.493 "claim_type": "exclusive_write", 00:15:53.493 "zoned": false, 00:15:53.493 "supported_io_types": { 00:15:53.493 "read": true, 00:15:53.493 "write": true, 00:15:53.493 "unmap": true, 00:15:53.493 "write_zeroes": true, 00:15:53.493 "flush": 
true, 00:15:53.493 "reset": true, 00:15:53.493 "compare": false, 00:15:53.493 "compare_and_write": false, 00:15:53.493 "abort": true, 00:15:53.493 "nvme_admin": false, 00:15:53.493 "nvme_io": false 00:15:53.493 }, 00:15:53.493 "memory_domains": [ 00:15:53.493 { 00:15:53.493 "dma_device_id": "system", 00:15:53.493 "dma_device_type": 1 00:15:53.493 }, 00:15:53.493 { 00:15:53.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.493 "dma_device_type": 2 00:15:53.493 } 00:15:53.493 ], 00:15:53.493 "driver_specific": {} 00:15:53.493 }' 00:15:53.493 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:53.493 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:53.493 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:53.493 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:53.493 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:53.752 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:53.752 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:53.752 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:53.752 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:53.752 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:53.752 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:53.752 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:53.752 11:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:15:54.011 [2024-05-14 11:52:21.002964] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:54.011 [2024-05-14 11:52:21.002989] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:54.011 [2024-05-14 11:52:21.003038] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 
00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.011 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:54.270 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:54.270 "name": "Existed_Raid", 00:15:54.270 "uuid": "a2f507ff-9101-4a13-a34c-4b6329b82d7e", 00:15:54.270 "strip_size_kb": 64, 00:15:54.270 "state": "offline", 00:15:54.270 "raid_level": "raid0", 00:15:54.270 "superblock": true, 00:15:54.270 "num_base_bdevs": 4, 00:15:54.270 "num_base_bdevs_discovered": 3, 00:15:54.270 "num_base_bdevs_operational": 3, 00:15:54.270 "base_bdevs_list": [ 00:15:54.270 { 00:15:54.270 "name": null, 00:15:54.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:54.270 "is_configured": false, 00:15:54.270 "data_offset": 2048, 00:15:54.270 "data_size": 63488 00:15:54.270 }, 00:15:54.270 { 00:15:54.270 "name": "BaseBdev2", 00:15:54.270 "uuid": "f17475b5-0cca-4f96-b055-410eb3d4a606", 00:15:54.270 "is_configured": true, 00:15:54.270 "data_offset": 2048, 00:15:54.270 "data_size": 63488 00:15:54.270 }, 00:15:54.270 { 00:15:54.270 "name": "BaseBdev3", 00:15:54.270 "uuid": "23762620-6abb-47cd-be89-34382046be45", 00:15:54.270 "is_configured": true, 00:15:54.271 "data_offset": 2048, 00:15:54.271 "data_size": 63488 00:15:54.271 }, 00:15:54.271 { 00:15:54.271 "name": "BaseBdev4", 00:15:54.271 "uuid": "40bcd81d-7738-4a40-b655-cdada4692d82", 00:15:54.271 "is_configured": true, 00:15:54.271 "data_offset": 2048, 00:15:54.271 "data_size": 63488 00:15:54.271 } 00:15:54.271 ] 00:15:54.271 }' 00:15:54.271 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:54.271 
11:52:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:54.838 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:15:54.838 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:54.838 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.838 11:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:55.097 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:55.097 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:55.097 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:55.357 [2024-05-14 11:52:22.315467] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:55.357 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:55.357 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:55.357 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.357 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:55.615 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:55.615 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:55.615 11:52:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:55.874 [2024-05-14 11:52:22.809230] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:55.874 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:55.874 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:55.874 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.874 11:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:56.133 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:56.133 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:56.133 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:56.393 [2024-05-14 11:52:23.302911] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:56.393 [2024-05-14 11:52:23.302953] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19e91b0 name Existed_Raid, state offline 00:15:56.393 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:56.393 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:56.393 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.393 11:52:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:15:56.653 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:15:56.653 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:15:56.653 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:15:56.653 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:15:56.653 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:56.653 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:56.912 BaseBdev2 00:15:56.912 11:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:15:56.912 11:52:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:56.912 11:52:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:56.912 11:52:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:56.912 11:52:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:56.912 11:52:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:56.912 11:52:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:57.171 11:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 
00:15:57.430 [ 00:15:57.430 { 00:15:57.430 "name": "BaseBdev2", 00:15:57.430 "aliases": [ 00:15:57.430 "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3" 00:15:57.430 ], 00:15:57.430 "product_name": "Malloc disk", 00:15:57.430 "block_size": 512, 00:15:57.430 "num_blocks": 65536, 00:15:57.431 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:15:57.431 "assigned_rate_limits": { 00:15:57.431 "rw_ios_per_sec": 0, 00:15:57.431 "rw_mbytes_per_sec": 0, 00:15:57.431 "r_mbytes_per_sec": 0, 00:15:57.431 "w_mbytes_per_sec": 0 00:15:57.431 }, 00:15:57.431 "claimed": false, 00:15:57.431 "zoned": false, 00:15:57.431 "supported_io_types": { 00:15:57.431 "read": true, 00:15:57.431 "write": true, 00:15:57.431 "unmap": true, 00:15:57.431 "write_zeroes": true, 00:15:57.431 "flush": true, 00:15:57.431 "reset": true, 00:15:57.431 "compare": false, 00:15:57.431 "compare_and_write": false, 00:15:57.431 "abort": true, 00:15:57.431 "nvme_admin": false, 00:15:57.431 "nvme_io": false 00:15:57.431 }, 00:15:57.431 "memory_domains": [ 00:15:57.431 { 00:15:57.431 "dma_device_id": "system", 00:15:57.431 "dma_device_type": 1 00:15:57.431 }, 00:15:57.431 { 00:15:57.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.431 "dma_device_type": 2 00:15:57.431 } 00:15:57.431 ], 00:15:57.431 "driver_specific": {} 00:15:57.431 } 00:15:57.431 ] 00:15:57.431 11:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:57.431 11:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:57.431 11:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:57.431 11:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:57.690 BaseBdev3 00:15:57.690 11:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 
00:15:57.690 11:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:57.690 11:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:57.690 11:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:57.690 11:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:57.690 11:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:57.690 11:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:57.949 11:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:57.949 [ 00:15:57.949 { 00:15:57.949 "name": "BaseBdev3", 00:15:57.949 "aliases": [ 00:15:57.949 "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b" 00:15:57.949 ], 00:15:57.949 "product_name": "Malloc disk", 00:15:57.949 "block_size": 512, 00:15:57.949 "num_blocks": 65536, 00:15:57.949 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:15:57.949 "assigned_rate_limits": { 00:15:57.949 "rw_ios_per_sec": 0, 00:15:57.949 "rw_mbytes_per_sec": 0, 00:15:57.949 "r_mbytes_per_sec": 0, 00:15:57.949 "w_mbytes_per_sec": 0 00:15:57.949 }, 00:15:57.949 "claimed": false, 00:15:57.949 "zoned": false, 00:15:57.949 "supported_io_types": { 00:15:57.949 "read": true, 00:15:57.949 "write": true, 00:15:57.949 "unmap": true, 00:15:57.949 "write_zeroes": true, 00:15:57.949 "flush": true, 00:15:57.949 "reset": true, 00:15:57.949 "compare": false, 00:15:57.949 "compare_and_write": false, 00:15:57.949 "abort": true, 00:15:57.949 "nvme_admin": false, 00:15:57.949 "nvme_io": false 00:15:57.949 }, 00:15:57.949 
"memory_domains": [ 00:15:57.949 { 00:15:57.949 "dma_device_id": "system", 00:15:57.949 "dma_device_type": 1 00:15:57.949 }, 00:15:57.949 { 00:15:57.949 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.949 "dma_device_type": 2 00:15:57.949 } 00:15:57.949 ], 00:15:57.949 "driver_specific": {} 00:15:57.949 } 00:15:57.949 ] 00:15:57.949 11:52:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:57.949 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:57.949 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:57.949 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:58.208 BaseBdev4 00:15:58.208 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:15:58.208 11:52:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:15:58.208 11:52:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:58.208 11:52:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:58.208 11:52:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:58.208 11:52:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:58.208 11:52:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:58.467 11:52:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 -t 2000 00:15:58.727 [ 00:15:58.727 { 00:15:58.727 "name": "BaseBdev4", 00:15:58.727 "aliases": [ 00:15:58.727 "db8990a9-8ce2-4098-9b17-65a0827417f5" 00:15:58.727 ], 00:15:58.727 "product_name": "Malloc disk", 00:15:58.727 "block_size": 512, 00:15:58.727 "num_blocks": 65536, 00:15:58.727 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:15:58.727 "assigned_rate_limits": { 00:15:58.727 "rw_ios_per_sec": 0, 00:15:58.727 "rw_mbytes_per_sec": 0, 00:15:58.727 "r_mbytes_per_sec": 0, 00:15:58.727 "w_mbytes_per_sec": 0 00:15:58.727 }, 00:15:58.727 "claimed": false, 00:15:58.727 "zoned": false, 00:15:58.727 "supported_io_types": { 00:15:58.727 "read": true, 00:15:58.727 "write": true, 00:15:58.727 "unmap": true, 00:15:58.727 "write_zeroes": true, 00:15:58.727 "flush": true, 00:15:58.727 "reset": true, 00:15:58.727 "compare": false, 00:15:58.727 "compare_and_write": false, 00:15:58.727 "abort": true, 00:15:58.727 "nvme_admin": false, 00:15:58.727 "nvme_io": false 00:15:58.727 }, 00:15:58.727 "memory_domains": [ 00:15:58.727 { 00:15:58.727 "dma_device_id": "system", 00:15:58.727 "dma_device_type": 1 00:15:58.727 }, 00:15:58.727 { 00:15:58.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.727 "dma_device_type": 2 00:15:58.727 } 00:15:58.727 ], 00:15:58.727 "driver_specific": {} 00:15:58.727 } 00:15:58.727 ] 00:15:58.727 11:52:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:58.727 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:58.727 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:58.727 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:58.986 [2024-05-14 11:52:25.970959] bdev.c:8109:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:58.986 [2024-05-14 11:52:25.970997] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:58.986 [2024-05-14 11:52:25.971016] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:58.986 [2024-05-14 11:52:25.972434] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:58.986 [2024-05-14 11:52:25.972478] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:58.986 11:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.986 11:52:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.246 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:59.246 "name": "Existed_Raid", 00:15:59.246 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:15:59.246 "strip_size_kb": 64, 00:15:59.246 "state": "configuring", 00:15:59.246 "raid_level": "raid0", 00:15:59.246 "superblock": true, 00:15:59.246 "num_base_bdevs": 4, 00:15:59.246 "num_base_bdevs_discovered": 3, 00:15:59.246 "num_base_bdevs_operational": 4, 00:15:59.246 "base_bdevs_list": [ 00:15:59.246 { 00:15:59.246 "name": "BaseBdev1", 00:15:59.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.246 "is_configured": false, 00:15:59.246 "data_offset": 0, 00:15:59.246 "data_size": 0 00:15:59.246 }, 00:15:59.246 { 00:15:59.246 "name": "BaseBdev2", 00:15:59.246 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:15:59.246 "is_configured": true, 00:15:59.246 "data_offset": 2048, 00:15:59.246 "data_size": 63488 00:15:59.246 }, 00:15:59.246 { 00:15:59.246 "name": "BaseBdev3", 00:15:59.246 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:15:59.246 "is_configured": true, 00:15:59.246 "data_offset": 2048, 00:15:59.246 "data_size": 63488 00:15:59.246 }, 00:15:59.246 { 00:15:59.246 "name": "BaseBdev4", 00:15:59.246 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:15:59.246 "is_configured": true, 00:15:59.246 "data_offset": 2048, 00:15:59.246 "data_size": 63488 00:15:59.246 } 00:15:59.246 ] 00:15:59.246 }' 00:15:59.246 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:59.246 11:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:59.814 11:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:00.073 
[2024-05-14 11:52:27.049851] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.073 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.333 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:00.333 "name": "Existed_Raid", 00:16:00.333 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:16:00.333 "strip_size_kb": 64, 00:16:00.333 "state": "configuring", 00:16:00.333 "raid_level": "raid0", 00:16:00.333 "superblock": true, 00:16:00.333 "num_base_bdevs": 
4, 00:16:00.333 "num_base_bdevs_discovered": 2, 00:16:00.333 "num_base_bdevs_operational": 4, 00:16:00.333 "base_bdevs_list": [ 00:16:00.333 { 00:16:00.333 "name": "BaseBdev1", 00:16:00.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.333 "is_configured": false, 00:16:00.333 "data_offset": 0, 00:16:00.333 "data_size": 0 00:16:00.333 }, 00:16:00.333 { 00:16:00.333 "name": null, 00:16:00.333 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:16:00.333 "is_configured": false, 00:16:00.333 "data_offset": 2048, 00:16:00.333 "data_size": 63488 00:16:00.333 }, 00:16:00.333 { 00:16:00.333 "name": "BaseBdev3", 00:16:00.333 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:16:00.333 "is_configured": true, 00:16:00.333 "data_offset": 2048, 00:16:00.333 "data_size": 63488 00:16:00.333 }, 00:16:00.333 { 00:16:00.333 "name": "BaseBdev4", 00:16:00.333 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:16:00.333 "is_configured": true, 00:16:00.333 "data_offset": 2048, 00:16:00.333 "data_size": 63488 00:16:00.333 } 00:16:00.333 ] 00:16:00.333 }' 00:16:00.333 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:00.333 11:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:00.902 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.902 11:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:01.162 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:16:01.162 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:01.162 [2024-05-14 11:52:28.237574] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:01.162 BaseBdev1 00:16:01.421 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:16:01.421 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:16:01.421 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:01.421 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:01.421 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:01.421 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:01.421 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:01.421 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:01.680 [ 00:16:01.680 { 00:16:01.680 "name": "BaseBdev1", 00:16:01.680 "aliases": [ 00:16:01.680 "c97aa614-0fd1-4b92-8125-94862342e8dc" 00:16:01.680 ], 00:16:01.680 "product_name": "Malloc disk", 00:16:01.680 "block_size": 512, 00:16:01.680 "num_blocks": 65536, 00:16:01.680 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:01.680 "assigned_rate_limits": { 00:16:01.680 "rw_ios_per_sec": 0, 00:16:01.680 "rw_mbytes_per_sec": 0, 00:16:01.680 "r_mbytes_per_sec": 0, 00:16:01.680 "w_mbytes_per_sec": 0 00:16:01.680 }, 00:16:01.680 "claimed": true, 00:16:01.680 "claim_type": "exclusive_write", 00:16:01.680 "zoned": false, 00:16:01.680 "supported_io_types": { 00:16:01.680 "read": true, 00:16:01.680 "write": true, 00:16:01.680 "unmap": true, 00:16:01.680 
"write_zeroes": true, 00:16:01.680 "flush": true, 00:16:01.680 "reset": true, 00:16:01.680 "compare": false, 00:16:01.680 "compare_and_write": false, 00:16:01.680 "abort": true, 00:16:01.680 "nvme_admin": false, 00:16:01.680 "nvme_io": false 00:16:01.680 }, 00:16:01.680 "memory_domains": [ 00:16:01.680 { 00:16:01.680 "dma_device_id": "system", 00:16:01.680 "dma_device_type": 1 00:16:01.680 }, 00:16:01.680 { 00:16:01.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.680 "dma_device_type": 2 00:16:01.680 } 00:16:01.680 ], 00:16:01.680 "driver_specific": {} 00:16:01.680 } 00:16:01.680 ] 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.680 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.939 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:01.939 "name": "Existed_Raid", 00:16:01.939 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:16:01.939 "strip_size_kb": 64, 00:16:01.939 "state": "configuring", 00:16:01.939 "raid_level": "raid0", 00:16:01.939 "superblock": true, 00:16:01.939 "num_base_bdevs": 4, 00:16:01.939 "num_base_bdevs_discovered": 3, 00:16:01.939 "num_base_bdevs_operational": 4, 00:16:01.939 "base_bdevs_list": [ 00:16:01.939 { 00:16:01.939 "name": "BaseBdev1", 00:16:01.939 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:01.939 "is_configured": true, 00:16:01.939 "data_offset": 2048, 00:16:01.939 "data_size": 63488 00:16:01.939 }, 00:16:01.939 { 00:16:01.939 "name": null, 00:16:01.939 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:16:01.939 "is_configured": false, 00:16:01.939 "data_offset": 2048, 00:16:01.939 "data_size": 63488 00:16:01.939 }, 00:16:01.939 { 00:16:01.939 "name": "BaseBdev3", 00:16:01.939 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:16:01.939 "is_configured": true, 00:16:01.939 "data_offset": 2048, 00:16:01.939 "data_size": 63488 00:16:01.939 }, 00:16:01.939 { 00:16:01.939 "name": "BaseBdev4", 00:16:01.939 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:16:01.939 "is_configured": true, 00:16:01.939 "data_offset": 2048, 00:16:01.939 "data_size": 63488 00:16:01.939 } 00:16:01.939 ] 00:16:01.939 }' 00:16:01.939 11:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:01.939 11:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:02.507 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:16:02.507 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.766 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:16:02.766 11:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:03.057 [2024-05-14 11:52:30.046431] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.057 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.316 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:03.316 "name": "Existed_Raid", 00:16:03.316 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:16:03.316 "strip_size_kb": 64, 00:16:03.316 "state": "configuring", 00:16:03.316 "raid_level": "raid0", 00:16:03.316 "superblock": true, 00:16:03.316 "num_base_bdevs": 4, 00:16:03.316 "num_base_bdevs_discovered": 2, 00:16:03.316 "num_base_bdevs_operational": 4, 00:16:03.316 "base_bdevs_list": [ 00:16:03.316 { 00:16:03.316 "name": "BaseBdev1", 00:16:03.316 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:03.316 "is_configured": true, 00:16:03.316 "data_offset": 2048, 00:16:03.316 "data_size": 63488 00:16:03.316 }, 00:16:03.316 { 00:16:03.316 "name": null, 00:16:03.316 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:16:03.316 "is_configured": false, 00:16:03.316 "data_offset": 2048, 00:16:03.316 "data_size": 63488 00:16:03.316 }, 00:16:03.316 { 00:16:03.316 "name": null, 00:16:03.316 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:16:03.316 "is_configured": false, 00:16:03.316 "data_offset": 2048, 00:16:03.316 "data_size": 63488 00:16:03.316 }, 00:16:03.316 { 00:16:03.316 "name": "BaseBdev4", 00:16:03.316 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:16:03.316 "is_configured": true, 00:16:03.316 "data_offset": 2048, 00:16:03.316 "data_size": 63488 00:16:03.316 } 00:16:03.316 ] 00:16:03.316 }' 00:16:03.316 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:03.316 11:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.884 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.884 11:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:04.142 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:16:04.142 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:04.402 [2024-05-14 11:52:31.406044] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.402 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.661 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:04.661 "name": "Existed_Raid", 00:16:04.661 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:16:04.661 "strip_size_kb": 64, 00:16:04.661 "state": "configuring", 00:16:04.661 "raid_level": "raid0", 00:16:04.661 "superblock": true, 00:16:04.661 "num_base_bdevs": 4, 00:16:04.661 "num_base_bdevs_discovered": 3, 00:16:04.661 "num_base_bdevs_operational": 4, 00:16:04.661 "base_bdevs_list": [ 00:16:04.661 { 00:16:04.661 "name": "BaseBdev1", 00:16:04.661 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:04.661 "is_configured": true, 00:16:04.661 "data_offset": 2048, 00:16:04.661 "data_size": 63488 00:16:04.661 }, 00:16:04.661 { 00:16:04.661 "name": null, 00:16:04.661 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:16:04.661 "is_configured": false, 00:16:04.661 "data_offset": 2048, 00:16:04.661 "data_size": 63488 00:16:04.661 }, 00:16:04.661 { 00:16:04.661 "name": "BaseBdev3", 00:16:04.661 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:16:04.661 "is_configured": true, 00:16:04.661 "data_offset": 2048, 00:16:04.661 "data_size": 63488 00:16:04.661 }, 00:16:04.661 { 00:16:04.661 "name": "BaseBdev4", 00:16:04.661 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:16:04.661 "is_configured": true, 00:16:04.661 "data_offset": 2048, 00:16:04.661 "data_size": 63488 00:16:04.661 } 00:16:04.661 ] 00:16:04.661 }' 00:16:04.661 11:52:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:04.661 11:52:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:05.229 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:16:05.229 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.487 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:16:05.487 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:05.746 [2024-05-14 11:52:32.733584] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.746 11:52:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.005 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:06.005 "name": "Existed_Raid", 00:16:06.005 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:16:06.005 "strip_size_kb": 64, 00:16:06.005 "state": "configuring", 00:16:06.005 "raid_level": "raid0", 00:16:06.005 "superblock": true, 00:16:06.005 "num_base_bdevs": 4, 00:16:06.005 "num_base_bdevs_discovered": 2, 00:16:06.005 "num_base_bdevs_operational": 4, 00:16:06.005 "base_bdevs_list": [ 00:16:06.005 { 00:16:06.005 "name": null, 00:16:06.005 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:06.005 "is_configured": false, 00:16:06.005 "data_offset": 2048, 00:16:06.005 "data_size": 63488 00:16:06.005 }, 00:16:06.005 { 00:16:06.005 "name": null, 00:16:06.005 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:16:06.005 "is_configured": false, 00:16:06.005 "data_offset": 2048, 00:16:06.005 "data_size": 63488 00:16:06.005 }, 00:16:06.005 { 00:16:06.005 "name": "BaseBdev3", 00:16:06.005 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:16:06.005 "is_configured": true, 00:16:06.005 "data_offset": 2048, 00:16:06.005 "data_size": 63488 00:16:06.005 }, 00:16:06.005 { 00:16:06.005 "name": "BaseBdev4", 00:16:06.005 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:16:06.005 "is_configured": true, 00:16:06.005 "data_offset": 2048, 00:16:06.005 "data_size": 63488 00:16:06.005 } 00:16:06.005 ] 00:16:06.005 }' 00:16:06.005 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:06.005 11:52:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:06.573 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.573 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:06.832 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:16:06.832 11:52:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:07.091 [2024-05-14 11:52:34.072316] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.091 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.350 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:07.350 "name": "Existed_Raid", 00:16:07.350 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:16:07.350 "strip_size_kb": 64, 00:16:07.350 "state": "configuring", 00:16:07.350 "raid_level": "raid0", 00:16:07.350 "superblock": true, 00:16:07.350 "num_base_bdevs": 4, 00:16:07.350 "num_base_bdevs_discovered": 3, 00:16:07.350 "num_base_bdevs_operational": 4, 00:16:07.350 "base_bdevs_list": [ 00:16:07.350 { 00:16:07.350 "name": null, 00:16:07.350 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:07.350 "is_configured": false, 00:16:07.350 "data_offset": 2048, 00:16:07.350 "data_size": 63488 00:16:07.350 }, 00:16:07.350 { 00:16:07.351 "name": "BaseBdev2", 00:16:07.351 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:16:07.351 "is_configured": true, 00:16:07.351 "data_offset": 2048, 00:16:07.351 "data_size": 63488 00:16:07.351 }, 00:16:07.351 { 00:16:07.351 "name": "BaseBdev3", 00:16:07.351 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:16:07.351 "is_configured": true, 00:16:07.351 "data_offset": 2048, 00:16:07.351 "data_size": 63488 00:16:07.351 }, 00:16:07.351 { 00:16:07.351 "name": "BaseBdev4", 00:16:07.351 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:16:07.351 "is_configured": true, 00:16:07.351 "data_offset": 2048, 00:16:07.351 "data_size": 63488 00:16:07.351 } 00:16:07.351 ] 00:16:07.351 }' 00:16:07.351 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:07.351 11:52:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:07.919 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.919 11:52:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:08.177 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:16:08.177 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.177 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:08.435 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c97aa614-0fd1-4b92-8125-94862342e8dc 00:16:08.695 [2024-05-14 11:52:35.664076] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:08.695 [2024-05-14 11:52:35.664236] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b9a920 00:16:08.695 [2024-05-14 11:52:35.664256] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:08.695 [2024-05-14 11:52:35.664447] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ea4f0 00:16:08.695 [2024-05-14 11:52:35.664569] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b9a920 00:16:08.695 [2024-05-14 11:52:35.664579] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b9a920 00:16:08.695 [2024-05-14 11:52:35.664672] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:08.695 NewBaseBdev 00:16:08.695 11:52:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:16:08.695 11:52:35 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:16:08.695 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:08.695 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:08.695 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:08.695 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:08.695 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:08.954 11:52:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:09.214 [ 00:16:09.214 { 00:16:09.214 "name": "NewBaseBdev", 00:16:09.214 "aliases": [ 00:16:09.214 "c97aa614-0fd1-4b92-8125-94862342e8dc" 00:16:09.214 ], 00:16:09.214 "product_name": "Malloc disk", 00:16:09.214 "block_size": 512, 00:16:09.214 "num_blocks": 65536, 00:16:09.214 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:09.214 "assigned_rate_limits": { 00:16:09.214 "rw_ios_per_sec": 0, 00:16:09.214 "rw_mbytes_per_sec": 0, 00:16:09.214 "r_mbytes_per_sec": 0, 00:16:09.214 "w_mbytes_per_sec": 0 00:16:09.214 }, 00:16:09.214 "claimed": true, 00:16:09.214 "claim_type": "exclusive_write", 00:16:09.214 "zoned": false, 00:16:09.214 "supported_io_types": { 00:16:09.214 "read": true, 00:16:09.214 "write": true, 00:16:09.214 "unmap": true, 00:16:09.214 "write_zeroes": true, 00:16:09.214 "flush": true, 00:16:09.214 "reset": true, 00:16:09.214 "compare": false, 00:16:09.214 "compare_and_write": false, 00:16:09.214 "abort": true, 00:16:09.214 "nvme_admin": false, 00:16:09.214 "nvme_io": false 
00:16:09.214 }, 00:16:09.214 "memory_domains": [ 00:16:09.214 { 00:16:09.214 "dma_device_id": "system", 00:16:09.214 "dma_device_type": 1 00:16:09.214 }, 00:16:09.214 { 00:16:09.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.214 "dma_device_type": 2 00:16:09.214 } 00:16:09.214 ], 00:16:09.214 "driver_specific": {} 00:16:09.214 } 00:16:09.214 ] 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.214 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.473 
11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:09.473 "name": "Existed_Raid", 00:16:09.473 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:16:09.473 "strip_size_kb": 64, 00:16:09.473 "state": "online", 00:16:09.473 "raid_level": "raid0", 00:16:09.473 "superblock": true, 00:16:09.473 "num_base_bdevs": 4, 00:16:09.473 "num_base_bdevs_discovered": 4, 00:16:09.473 "num_base_bdevs_operational": 4, 00:16:09.473 "base_bdevs_list": [ 00:16:09.473 { 00:16:09.473 "name": "NewBaseBdev", 00:16:09.473 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:09.473 "is_configured": true, 00:16:09.473 "data_offset": 2048, 00:16:09.473 "data_size": 63488 00:16:09.473 }, 00:16:09.473 { 00:16:09.473 "name": "BaseBdev2", 00:16:09.473 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:16:09.473 "is_configured": true, 00:16:09.473 "data_offset": 2048, 00:16:09.473 "data_size": 63488 00:16:09.473 }, 00:16:09.473 { 00:16:09.473 "name": "BaseBdev3", 00:16:09.473 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:16:09.473 "is_configured": true, 00:16:09.473 "data_offset": 2048, 00:16:09.473 "data_size": 63488 00:16:09.473 }, 00:16:09.473 { 00:16:09.473 "name": "BaseBdev4", 00:16:09.473 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:16:09.473 "is_configured": true, 00:16:09.473 "data_offset": 2048, 00:16:09.473 "data_size": 63488 00:16:09.473 } 00:16:09.473 ] 00:16:09.473 }' 00:16:09.473 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:09.473 11:52:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:10.039 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:16:10.039 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:16:10.039 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local 
raid_bdev_info 00:16:10.039 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:10.039 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:10.039 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:16:10.039 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:10.039 11:52:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:10.298 [2024-05-14 11:52:37.200467] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:10.298 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:10.298 "name": "Existed_Raid", 00:16:10.298 "aliases": [ 00:16:10.298 "fd298d14-8664-428e-8f05-845536713ecb" 00:16:10.298 ], 00:16:10.298 "product_name": "Raid Volume", 00:16:10.298 "block_size": 512, 00:16:10.298 "num_blocks": 253952, 00:16:10.298 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:16:10.298 "assigned_rate_limits": { 00:16:10.298 "rw_ios_per_sec": 0, 00:16:10.298 "rw_mbytes_per_sec": 0, 00:16:10.298 "r_mbytes_per_sec": 0, 00:16:10.298 "w_mbytes_per_sec": 0 00:16:10.298 }, 00:16:10.298 "claimed": false, 00:16:10.298 "zoned": false, 00:16:10.298 "supported_io_types": { 00:16:10.298 "read": true, 00:16:10.298 "write": true, 00:16:10.298 "unmap": true, 00:16:10.298 "write_zeroes": true, 00:16:10.298 "flush": true, 00:16:10.298 "reset": true, 00:16:10.298 "compare": false, 00:16:10.298 "compare_and_write": false, 00:16:10.298 "abort": false, 00:16:10.298 "nvme_admin": false, 00:16:10.298 "nvme_io": false 00:16:10.298 }, 00:16:10.298 "memory_domains": [ 00:16:10.298 { 00:16:10.298 "dma_device_id": "system", 00:16:10.298 "dma_device_type": 1 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.298 "dma_device_type": 2 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 "dma_device_id": "system", 00:16:10.298 "dma_device_type": 1 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.298 "dma_device_type": 2 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 "dma_device_id": "system", 00:16:10.298 "dma_device_type": 1 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.298 "dma_device_type": 2 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 "dma_device_id": "system", 00:16:10.298 "dma_device_type": 1 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.298 "dma_device_type": 2 00:16:10.298 } 00:16:10.298 ], 00:16:10.298 "driver_specific": { 00:16:10.298 "raid": { 00:16:10.298 "uuid": "fd298d14-8664-428e-8f05-845536713ecb", 00:16:10.298 "strip_size_kb": 64, 00:16:10.298 "state": "online", 00:16:10.298 "raid_level": "raid0", 00:16:10.298 "superblock": true, 00:16:10.298 "num_base_bdevs": 4, 00:16:10.298 "num_base_bdevs_discovered": 4, 00:16:10.298 "num_base_bdevs_operational": 4, 00:16:10.298 "base_bdevs_list": [ 00:16:10.298 { 00:16:10.298 "name": "NewBaseBdev", 00:16:10.298 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:10.298 "is_configured": true, 00:16:10.298 "data_offset": 2048, 00:16:10.298 "data_size": 63488 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 "name": "BaseBdev2", 00:16:10.298 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:16:10.298 "is_configured": true, 00:16:10.298 "data_offset": 2048, 00:16:10.298 "data_size": 63488 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 "name": "BaseBdev3", 00:16:10.298 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:16:10.298 "is_configured": true, 00:16:10.298 "data_offset": 2048, 00:16:10.298 "data_size": 63488 00:16:10.298 }, 00:16:10.298 { 00:16:10.298 "name": "BaseBdev4", 00:16:10.298 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 
00:16:10.298 "is_configured": true, 00:16:10.298 "data_offset": 2048, 00:16:10.298 "data_size": 63488 00:16:10.298 } 00:16:10.298 ] 00:16:10.298 } 00:16:10.298 } 00:16:10.298 }' 00:16:10.298 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:10.298 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:16:10.298 BaseBdev2 00:16:10.298 BaseBdev3 00:16:10.298 BaseBdev4' 00:16:10.298 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:10.298 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:10.298 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:10.557 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:10.557 "name": "NewBaseBdev", 00:16:10.557 "aliases": [ 00:16:10.557 "c97aa614-0fd1-4b92-8125-94862342e8dc" 00:16:10.557 ], 00:16:10.557 "product_name": "Malloc disk", 00:16:10.557 "block_size": 512, 00:16:10.557 "num_blocks": 65536, 00:16:10.557 "uuid": "c97aa614-0fd1-4b92-8125-94862342e8dc", 00:16:10.557 "assigned_rate_limits": { 00:16:10.557 "rw_ios_per_sec": 0, 00:16:10.557 "rw_mbytes_per_sec": 0, 00:16:10.557 "r_mbytes_per_sec": 0, 00:16:10.557 "w_mbytes_per_sec": 0 00:16:10.557 }, 00:16:10.557 "claimed": true, 00:16:10.557 "claim_type": "exclusive_write", 00:16:10.557 "zoned": false, 00:16:10.557 "supported_io_types": { 00:16:10.557 "read": true, 00:16:10.557 "write": true, 00:16:10.557 "unmap": true, 00:16:10.557 "write_zeroes": true, 00:16:10.557 "flush": true, 00:16:10.557 "reset": true, 00:16:10.557 "compare": false, 00:16:10.557 "compare_and_write": false, 00:16:10.557 "abort": true, 
00:16:10.557 "nvme_admin": false, 00:16:10.557 "nvme_io": false 00:16:10.557 }, 00:16:10.557 "memory_domains": [ 00:16:10.557 { 00:16:10.557 "dma_device_id": "system", 00:16:10.557 "dma_device_type": 1 00:16:10.557 }, 00:16:10.557 { 00:16:10.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.557 "dma_device_type": 2 00:16:10.557 } 00:16:10.557 ], 00:16:10.557 "driver_specific": {} 00:16:10.557 }' 00:16:10.557 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:10.557 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:10.557 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:10.557 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev2 00:16:10.816 11:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:11.075 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:11.075 "name": "BaseBdev2", 00:16:11.075 "aliases": [ 00:16:11.075 "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3" 00:16:11.075 ], 00:16:11.075 "product_name": "Malloc disk", 00:16:11.075 "block_size": 512, 00:16:11.075 "num_blocks": 65536, 00:16:11.075 "uuid": "bedb8ac8-7cfe-4efc-bab6-a8ca5d0c0dd3", 00:16:11.075 "assigned_rate_limits": { 00:16:11.075 "rw_ios_per_sec": 0, 00:16:11.075 "rw_mbytes_per_sec": 0, 00:16:11.075 "r_mbytes_per_sec": 0, 00:16:11.075 "w_mbytes_per_sec": 0 00:16:11.075 }, 00:16:11.075 "claimed": true, 00:16:11.075 "claim_type": "exclusive_write", 00:16:11.075 "zoned": false, 00:16:11.075 "supported_io_types": { 00:16:11.075 "read": true, 00:16:11.075 "write": true, 00:16:11.075 "unmap": true, 00:16:11.075 "write_zeroes": true, 00:16:11.075 "flush": true, 00:16:11.075 "reset": true, 00:16:11.075 "compare": false, 00:16:11.075 "compare_and_write": false, 00:16:11.075 "abort": true, 00:16:11.075 "nvme_admin": false, 00:16:11.075 "nvme_io": false 00:16:11.075 }, 00:16:11.075 "memory_domains": [ 00:16:11.075 { 00:16:11.075 "dma_device_id": "system", 00:16:11.075 "dma_device_type": 1 00:16:11.075 }, 00:16:11.075 { 00:16:11.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.075 "dma_device_type": 2 00:16:11.075 } 00:16:11.075 ], 00:16:11.075 "driver_specific": {} 00:16:11.075 }' 00:16:11.075 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:11.333 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:11.333 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:11.333 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:11.333 11:52:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:11.333 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.333 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:11.334 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:11.334 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.334 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:11.592 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:11.592 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:11.592 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:11.592 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:11.592 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:11.851 "name": "BaseBdev3", 00:16:11.851 "aliases": [ 00:16:11.851 "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b" 00:16:11.851 ], 00:16:11.851 "product_name": "Malloc disk", 00:16:11.851 "block_size": 512, 00:16:11.851 "num_blocks": 65536, 00:16:11.851 "uuid": "c34439fa-2e6d-4c5b-b2ab-7a089b3eb19b", 00:16:11.851 "assigned_rate_limits": { 00:16:11.851 "rw_ios_per_sec": 0, 00:16:11.851 "rw_mbytes_per_sec": 0, 00:16:11.851 "r_mbytes_per_sec": 0, 00:16:11.851 "w_mbytes_per_sec": 0 00:16:11.851 }, 00:16:11.851 "claimed": true, 00:16:11.851 "claim_type": "exclusive_write", 00:16:11.851 "zoned": false, 00:16:11.851 "supported_io_types": 
{ 00:16:11.851 "read": true, 00:16:11.851 "write": true, 00:16:11.851 "unmap": true, 00:16:11.851 "write_zeroes": true, 00:16:11.851 "flush": true, 00:16:11.851 "reset": true, 00:16:11.851 "compare": false, 00:16:11.851 "compare_and_write": false, 00:16:11.851 "abort": true, 00:16:11.851 "nvme_admin": false, 00:16:11.851 "nvme_io": false 00:16:11.851 }, 00:16:11.851 "memory_domains": [ 00:16:11.851 { 00:16:11.851 "dma_device_id": "system", 00:16:11.851 "dma_device_type": 1 00:16:11.851 }, 00:16:11.851 { 00:16:11.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.851 "dma_device_type": 2 00:16:11.851 } 00:16:11.851 ], 00:16:11.851 "driver_specific": {} 00:16:11.851 }' 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.851 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:12.110 11:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:12.110 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:12.110 11:52:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:12.110 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:12.110 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:12.368 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:12.368 "name": "BaseBdev4", 00:16:12.368 "aliases": [ 00:16:12.368 "db8990a9-8ce2-4098-9b17-65a0827417f5" 00:16:12.368 ], 00:16:12.368 "product_name": "Malloc disk", 00:16:12.368 "block_size": 512, 00:16:12.368 "num_blocks": 65536, 00:16:12.368 "uuid": "db8990a9-8ce2-4098-9b17-65a0827417f5", 00:16:12.368 "assigned_rate_limits": { 00:16:12.368 "rw_ios_per_sec": 0, 00:16:12.368 "rw_mbytes_per_sec": 0, 00:16:12.368 "r_mbytes_per_sec": 0, 00:16:12.368 "w_mbytes_per_sec": 0 00:16:12.368 }, 00:16:12.368 "claimed": true, 00:16:12.368 "claim_type": "exclusive_write", 00:16:12.368 "zoned": false, 00:16:12.368 "supported_io_types": { 00:16:12.368 "read": true, 00:16:12.368 "write": true, 00:16:12.368 "unmap": true, 00:16:12.368 "write_zeroes": true, 00:16:12.368 "flush": true, 00:16:12.368 "reset": true, 00:16:12.368 "compare": false, 00:16:12.368 "compare_and_write": false, 00:16:12.368 "abort": true, 00:16:12.368 "nvme_admin": false, 00:16:12.368 "nvme_io": false 00:16:12.368 }, 00:16:12.368 "memory_domains": [ 00:16:12.368 { 00:16:12.368 "dma_device_id": "system", 00:16:12.368 "dma_device_type": 1 00:16:12.368 }, 00:16:12.368 { 00:16:12.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.368 "dma_device_type": 2 00:16:12.368 } 00:16:12.368 ], 00:16:12.368 "driver_specific": {} 00:16:12.368 }' 00:16:12.368 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:12.368 11:52:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:12.368 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:12.368 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:12.368 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:12.368 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.368 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:12.627 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:12.627 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.627 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:12.627 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:12.627 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:12.627 11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:12.886 [2024-05-14 11:52:39.795050] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:12.886 [2024-05-14 11:52:39.795076] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:12.886 [2024-05-14 11:52:39.795131] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:12.886 [2024-05-14 11:52:39.795188] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:12.886 [2024-05-14 11:52:39.795200] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b9a920 name Existed_Raid, state offline 00:16:12.886 
11:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 1717019 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1717019 ']' 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 1717019 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1717019 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1717019' 00:16:12.886 killing process with pid 1717019 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 1717019 00:16:12.886 [2024-05-14 11:52:39.860058] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:12.886 11:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 1717019 00:16:12.886 [2024-05-14 11:52:39.897446] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:13.146 11:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:16:13.146 00:16:13.146 real 0m31.648s 00:16:13.146 user 0m58.020s 00:16:13.146 sys 0m5.697s 00:16:13.146 11:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:16:13.146 11:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:13.146 
************************************ 00:16:13.146 END TEST raid_state_function_test_sb 00:16:13.146 ************************************ 00:16:13.146 11:52:40 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:16:13.146 11:52:40 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:16:13.146 11:52:40 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:16:13.146 11:52:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:13.146 ************************************ 00:16:13.146 START TEST raid_superblock_test 00:16:13.146 ************************************ 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid0 4 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid0 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=4 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid0 '!=' raid1 ']' 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=1721909 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 1721909 /var/tmp/spdk-raid.sock 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 1721909 ']' 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:13.146 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:16:13.146 11:52:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.405 [2024-05-14 11:52:40.265702] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:16:13.405 [2024-05-14 11:52:40.265763] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1721909 ] 00:16:13.405 [2024-05-14 11:52:40.395565] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:13.664 [2024-05-14 11:52:40.502801] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:13.664 [2024-05-14 11:52:40.571245] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:13.664 [2024-05-14 11:52:40.571290] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:14.234 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:16:14.493 malloc1 00:16:14.493 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:14.752 [2024-05-14 11:52:41.665375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:14.752 [2024-05-14 11:52:41.665428] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:14.752 [2024-05-14 11:52:41.665451] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd422a0 00:16:14.752 [2024-05-14 11:52:41.665463] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:14.752 [2024-05-14 11:52:41.667226] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:14.752 [2024-05-14 11:52:41.667256] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:14.752 pt1 00:16:14.752 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:14.752 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:14.752 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:16:14.752 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:16:14.752 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:14.752 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:14.752 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:14.752 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:14.752 11:52:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:15.011 malloc2 00:16:15.011 11:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:15.271 [2024-05-14 11:52:42.160498] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:15.271 [2024-05-14 11:52:42.160544] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:15.271 [2024-05-14 11:52:42.160569] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef5480 00:16:15.271 [2024-05-14 11:52:42.160581] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:15.271 [2024-05-14 11:52:42.162193] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:15.271 [2024-05-14 11:52:42.162220] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:15.271 pt2 00:16:15.271 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:15.271 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:15.271 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:16:15.271 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:16:15.271 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:15.271 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:15.271 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:15.271 11:52:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:15.271 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:15.530 malloc3 00:16:15.530 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:15.790 [2024-05-14 11:52:42.655655] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:15.790 [2024-05-14 11:52:42.655702] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:15.790 [2024-05-14 11:52:42.655721] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd3be80 00:16:15.790 [2024-05-14 11:52:42.655734] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:15.790 [2024-05-14 11:52:42.657346] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:15.790 [2024-05-14 11:52:42.657374] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:15.790 pt3 00:16:15.790 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:15.790 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:15.790 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc4 00:16:15.790 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt4 00:16:15.790 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:15.790 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:15.790 11:52:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:15.790 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:15.790 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:16.049 malloc4 00:16:16.049 11:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:16.308 [2024-05-14 11:52:43.141532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:16.308 [2024-05-14 11:52:43.141580] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:16.308 [2024-05-14 11:52:43.141605] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd3e490 00:16:16.308 [2024-05-14 11:52:43.141619] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:16.308 [2024-05-14 11:52:43.143183] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:16.308 [2024-05-14 11:52:43.143212] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:16.308 pt4 00:16:16.308 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:16.308 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:16.308 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:16:16.308 [2024-05-14 11:52:43.374168] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:16:16.309 [2024-05-14 11:52:43.375476] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:16.309 [2024-05-14 11:52:43.375530] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:16.309 [2024-05-14 11:52:43.375575] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:16.309 [2024-05-14 11:52:43.375761] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xd3f7a0 00:16:16.309 [2024-05-14 11:52:43.375772] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:16.309 [2024-05-14 11:52:43.375969] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd3f770 00:16:16.309 [2024-05-14 11:52:43.376116] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd3f7a0 00:16:16.309 [2024-05-14 11:52:43.376125] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd3f7a0 00:16:16.309 [2024-05-14 11:52:43.376231] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:16.309 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:16.568 "name": "raid_bdev1", 00:16:16.568 "uuid": "0843e91a-eb87-421a-a365-859ad33316f1", 00:16:16.568 "strip_size_kb": 64, 00:16:16.568 "state": "online", 00:16:16.568 "raid_level": "raid0", 00:16:16.568 "superblock": true, 00:16:16.568 "num_base_bdevs": 4, 00:16:16.568 "num_base_bdevs_discovered": 4, 00:16:16.568 "num_base_bdevs_operational": 4, 00:16:16.568 "base_bdevs_list": [ 00:16:16.568 { 00:16:16.568 "name": "pt1", 00:16:16.568 "uuid": "d6920e4e-5cae-5d71-81e8-fea0d4e1d044", 00:16:16.568 "is_configured": true, 00:16:16.568 "data_offset": 2048, 00:16:16.568 "data_size": 63488 00:16:16.568 }, 00:16:16.568 { 00:16:16.568 "name": "pt2", 00:16:16.568 "uuid": "a3730f03-3fbe-53a7-ac46-975544db1cd5", 00:16:16.568 "is_configured": true, 00:16:16.568 "data_offset": 2048, 00:16:16.568 "data_size": 63488 00:16:16.568 }, 00:16:16.568 { 00:16:16.568 "name": "pt3", 00:16:16.568 "uuid": "8cfe2fe6-18d0-5e80-a304-74222d818d91", 00:16:16.568 "is_configured": true, 00:16:16.568 "data_offset": 2048, 00:16:16.568 "data_size": 63488 00:16:16.568 }, 00:16:16.568 { 00:16:16.568 "name": "pt4", 00:16:16.568 "uuid": "88933881-3d11-52e0-aef2-4e680cf3a6cf", 00:16:16.568 "is_configured": true, 00:16:16.568 "data_offset": 2048, 00:16:16.568 "data_size": 63488 00:16:16.568 } 00:16:16.568 ] 00:16:16.568 }' 00:16:16.568 11:52:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:16.568 11:52:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:17.166 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:16:17.166 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:16:17.166 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:17.166 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:17.166 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:17.166 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:17.166 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:17.166 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:17.425 [2024-05-14 11:52:44.425193] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:17.425 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:17.425 "name": "raid_bdev1", 00:16:17.425 "aliases": [ 00:16:17.425 "0843e91a-eb87-421a-a365-859ad33316f1" 00:16:17.425 ], 00:16:17.425 "product_name": "Raid Volume", 00:16:17.425 "block_size": 512, 00:16:17.425 "num_blocks": 253952, 00:16:17.425 "uuid": "0843e91a-eb87-421a-a365-859ad33316f1", 00:16:17.425 "assigned_rate_limits": { 00:16:17.425 "rw_ios_per_sec": 0, 00:16:17.425 "rw_mbytes_per_sec": 0, 00:16:17.425 "r_mbytes_per_sec": 0, 00:16:17.425 "w_mbytes_per_sec": 0 00:16:17.425 }, 00:16:17.425 "claimed": false, 00:16:17.425 "zoned": false, 00:16:17.425 "supported_io_types": { 00:16:17.425 "read": true, 00:16:17.425 "write": true, 00:16:17.425 
"unmap": true, 00:16:17.425 "write_zeroes": true, 00:16:17.425 "flush": true, 00:16:17.425 "reset": true, 00:16:17.425 "compare": false, 00:16:17.425 "compare_and_write": false, 00:16:17.425 "abort": false, 00:16:17.425 "nvme_admin": false, 00:16:17.425 "nvme_io": false 00:16:17.425 }, 00:16:17.425 "memory_domains": [ 00:16:17.425 { 00:16:17.425 "dma_device_id": "system", 00:16:17.425 "dma_device_type": 1 00:16:17.425 }, 00:16:17.425 { 00:16:17.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.425 "dma_device_type": 2 00:16:17.425 }, 00:16:17.425 { 00:16:17.425 "dma_device_id": "system", 00:16:17.425 "dma_device_type": 1 00:16:17.425 }, 00:16:17.425 { 00:16:17.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.425 "dma_device_type": 2 00:16:17.425 }, 00:16:17.425 { 00:16:17.425 "dma_device_id": "system", 00:16:17.425 "dma_device_type": 1 00:16:17.425 }, 00:16:17.425 { 00:16:17.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.425 "dma_device_type": 2 00:16:17.425 }, 00:16:17.425 { 00:16:17.425 "dma_device_id": "system", 00:16:17.425 "dma_device_type": 1 00:16:17.425 }, 00:16:17.425 { 00:16:17.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.425 "dma_device_type": 2 00:16:17.425 } 00:16:17.425 ], 00:16:17.425 "driver_specific": { 00:16:17.425 "raid": { 00:16:17.425 "uuid": "0843e91a-eb87-421a-a365-859ad33316f1", 00:16:17.425 "strip_size_kb": 64, 00:16:17.425 "state": "online", 00:16:17.425 "raid_level": "raid0", 00:16:17.425 "superblock": true, 00:16:17.425 "num_base_bdevs": 4, 00:16:17.426 "num_base_bdevs_discovered": 4, 00:16:17.426 "num_base_bdevs_operational": 4, 00:16:17.426 "base_bdevs_list": [ 00:16:17.426 { 00:16:17.426 "name": "pt1", 00:16:17.426 "uuid": "d6920e4e-5cae-5d71-81e8-fea0d4e1d044", 00:16:17.426 "is_configured": true, 00:16:17.426 "data_offset": 2048, 00:16:17.426 "data_size": 63488 00:16:17.426 }, 00:16:17.426 { 00:16:17.426 "name": "pt2", 00:16:17.426 "uuid": "a3730f03-3fbe-53a7-ac46-975544db1cd5", 00:16:17.426 
"is_configured": true, 00:16:17.426 "data_offset": 2048, 00:16:17.426 "data_size": 63488 00:16:17.426 }, 00:16:17.426 { 00:16:17.426 "name": "pt3", 00:16:17.426 "uuid": "8cfe2fe6-18d0-5e80-a304-74222d818d91", 00:16:17.426 "is_configured": true, 00:16:17.426 "data_offset": 2048, 00:16:17.426 "data_size": 63488 00:16:17.426 }, 00:16:17.426 { 00:16:17.426 "name": "pt4", 00:16:17.426 "uuid": "88933881-3d11-52e0-aef2-4e680cf3a6cf", 00:16:17.426 "is_configured": true, 00:16:17.426 "data_offset": 2048, 00:16:17.426 "data_size": 63488 00:16:17.426 } 00:16:17.426 ] 00:16:17.426 } 00:16:17.426 } 00:16:17.426 }' 00:16:17.426 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:17.426 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:16:17.426 pt2 00:16:17.426 pt3 00:16:17.426 pt4' 00:16:17.426 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:17.426 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:17.426 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:17.685 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:17.685 "name": "pt1", 00:16:17.685 "aliases": [ 00:16:17.685 "d6920e4e-5cae-5d71-81e8-fea0d4e1d044" 00:16:17.685 ], 00:16:17.685 "product_name": "passthru", 00:16:17.685 "block_size": 512, 00:16:17.685 "num_blocks": 65536, 00:16:17.685 "uuid": "d6920e4e-5cae-5d71-81e8-fea0d4e1d044", 00:16:17.685 "assigned_rate_limits": { 00:16:17.685 "rw_ios_per_sec": 0, 00:16:17.685 "rw_mbytes_per_sec": 0, 00:16:17.685 "r_mbytes_per_sec": 0, 00:16:17.685 "w_mbytes_per_sec": 0 00:16:17.685 }, 00:16:17.685 "claimed": true, 00:16:17.685 "claim_type": "exclusive_write", 
00:16:17.685 "zoned": false, 00:16:17.685 "supported_io_types": { 00:16:17.685 "read": true, 00:16:17.685 "write": true, 00:16:17.685 "unmap": true, 00:16:17.685 "write_zeroes": true, 00:16:17.685 "flush": true, 00:16:17.685 "reset": true, 00:16:17.685 "compare": false, 00:16:17.685 "compare_and_write": false, 00:16:17.685 "abort": true, 00:16:17.685 "nvme_admin": false, 00:16:17.685 "nvme_io": false 00:16:17.685 }, 00:16:17.685 "memory_domains": [ 00:16:17.685 { 00:16:17.685 "dma_device_id": "system", 00:16:17.685 "dma_device_type": 1 00:16:17.685 }, 00:16:17.685 { 00:16:17.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.685 "dma_device_type": 2 00:16:17.685 } 00:16:17.685 ], 00:16:17.685 "driver_specific": { 00:16:17.685 "passthru": { 00:16:17.685 "name": "pt1", 00:16:17.685 "base_bdev_name": "malloc1" 00:16:17.685 } 00:16:17.685 } 00:16:17.685 }' 00:16:17.685 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:17.944 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:17.944 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:17.945 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:17.945 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:17.945 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.945 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:17.945 11:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:17.945 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:17.945 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:18.203 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:18.203 11:52:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:18.203 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:18.203 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:18.203 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:18.463 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:18.463 "name": "pt2", 00:16:18.463 "aliases": [ 00:16:18.463 "a3730f03-3fbe-53a7-ac46-975544db1cd5" 00:16:18.463 ], 00:16:18.463 "product_name": "passthru", 00:16:18.463 "block_size": 512, 00:16:18.463 "num_blocks": 65536, 00:16:18.463 "uuid": "a3730f03-3fbe-53a7-ac46-975544db1cd5", 00:16:18.463 "assigned_rate_limits": { 00:16:18.463 "rw_ios_per_sec": 0, 00:16:18.463 "rw_mbytes_per_sec": 0, 00:16:18.463 "r_mbytes_per_sec": 0, 00:16:18.463 "w_mbytes_per_sec": 0 00:16:18.463 }, 00:16:18.463 "claimed": true, 00:16:18.463 "claim_type": "exclusive_write", 00:16:18.463 "zoned": false, 00:16:18.463 "supported_io_types": { 00:16:18.463 "read": true, 00:16:18.463 "write": true, 00:16:18.463 "unmap": true, 00:16:18.463 "write_zeroes": true, 00:16:18.463 "flush": true, 00:16:18.463 "reset": true, 00:16:18.463 "compare": false, 00:16:18.463 "compare_and_write": false, 00:16:18.463 "abort": true, 00:16:18.463 "nvme_admin": false, 00:16:18.463 "nvme_io": false 00:16:18.463 }, 00:16:18.463 "memory_domains": [ 00:16:18.463 { 00:16:18.463 "dma_device_id": "system", 00:16:18.463 "dma_device_type": 1 00:16:18.463 }, 00:16:18.463 { 00:16:18.463 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.463 "dma_device_type": 2 00:16:18.463 } 00:16:18.463 ], 00:16:18.463 "driver_specific": { 00:16:18.463 "passthru": { 00:16:18.463 "name": "pt2", 00:16:18.463 "base_bdev_name": "malloc2" 00:16:18.463 } 00:16:18.463 } 
00:16:18.463 }' 00:16:18.463 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:18.463 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:18.463 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:18.463 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:18.463 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:18.463 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.463 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:18.722 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:18.722 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.722 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:18.722 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:18.722 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:18.722 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:18.722 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:18.722 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:18.981 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:18.981 "name": "pt3", 00:16:18.981 "aliases": [ 00:16:18.981 "8cfe2fe6-18d0-5e80-a304-74222d818d91" 00:16:18.981 ], 00:16:18.981 "product_name": "passthru", 00:16:18.981 "block_size": 512, 00:16:18.981 "num_blocks": 65536, 00:16:18.981 "uuid": "8cfe2fe6-18d0-5e80-a304-74222d818d91", 
00:16:18.981 "assigned_rate_limits": { 00:16:18.981 "rw_ios_per_sec": 0, 00:16:18.981 "rw_mbytes_per_sec": 0, 00:16:18.981 "r_mbytes_per_sec": 0, 00:16:18.981 "w_mbytes_per_sec": 0 00:16:18.981 }, 00:16:18.981 "claimed": true, 00:16:18.981 "claim_type": "exclusive_write", 00:16:18.981 "zoned": false, 00:16:18.981 "supported_io_types": { 00:16:18.981 "read": true, 00:16:18.981 "write": true, 00:16:18.981 "unmap": true, 00:16:18.981 "write_zeroes": true, 00:16:18.981 "flush": true, 00:16:18.981 "reset": true, 00:16:18.981 "compare": false, 00:16:18.981 "compare_and_write": false, 00:16:18.981 "abort": true, 00:16:18.981 "nvme_admin": false, 00:16:18.981 "nvme_io": false 00:16:18.981 }, 00:16:18.981 "memory_domains": [ 00:16:18.981 { 00:16:18.981 "dma_device_id": "system", 00:16:18.981 "dma_device_type": 1 00:16:18.981 }, 00:16:18.981 { 00:16:18.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.981 "dma_device_type": 2 00:16:18.981 } 00:16:18.981 ], 00:16:18.981 "driver_specific": { 00:16:18.981 "passthru": { 00:16:18.981 "name": "pt3", 00:16:18.981 "base_bdev_name": "malloc3" 00:16:18.981 } 00:16:18.981 } 00:16:18.981 }' 00:16:18.981 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:18.981 11:52:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:18.981 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:18.981 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:19.240 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:19.499 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:19.499 "name": "pt4", 00:16:19.499 "aliases": [ 00:16:19.499 "88933881-3d11-52e0-aef2-4e680cf3a6cf" 00:16:19.499 ], 00:16:19.499 "product_name": "passthru", 00:16:19.499 "block_size": 512, 00:16:19.499 "num_blocks": 65536, 00:16:19.499 "uuid": "88933881-3d11-52e0-aef2-4e680cf3a6cf", 00:16:19.499 "assigned_rate_limits": { 00:16:19.499 "rw_ios_per_sec": 0, 00:16:19.499 "rw_mbytes_per_sec": 0, 00:16:19.499 "r_mbytes_per_sec": 0, 00:16:19.499 "w_mbytes_per_sec": 0 00:16:19.499 }, 00:16:19.499 "claimed": true, 00:16:19.499 "claim_type": "exclusive_write", 00:16:19.499 "zoned": false, 00:16:19.499 "supported_io_types": { 00:16:19.499 "read": true, 00:16:19.499 "write": true, 00:16:19.499 "unmap": true, 00:16:19.499 "write_zeroes": true, 00:16:19.499 "flush": true, 00:16:19.499 "reset": true, 00:16:19.499 "compare": false, 00:16:19.499 "compare_and_write": false, 00:16:19.499 "abort": true, 00:16:19.499 "nvme_admin": false, 00:16:19.499 "nvme_io": false 00:16:19.499 }, 00:16:19.499 "memory_domains": [ 00:16:19.499 { 00:16:19.499 "dma_device_id": "system", 00:16:19.499 "dma_device_type": 1 00:16:19.499 }, 00:16:19.499 { 00:16:19.499 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.499 "dma_device_type": 2 00:16:19.499 } 00:16:19.499 ], 00:16:19.499 "driver_specific": { 00:16:19.499 "passthru": { 00:16:19.499 "name": "pt4", 00:16:19.499 "base_bdev_name": "malloc4" 00:16:19.499 } 00:16:19.499 } 00:16:19.499 }' 00:16:19.499 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:19.499 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:19.762 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:19.762 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:19.762 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:19.762 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:19.762 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:19.762 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:19.762 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:19.762 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:19.763 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:19.763 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:19.763 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:16:19.763 11:52:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:20.022 [2024-05-14 11:52:47.028151] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:20.022 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # 
raid_bdev_uuid=0843e91a-eb87-421a-a365-859ad33316f1 00:16:20.022 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 0843e91a-eb87-421a-a365-859ad33316f1 ']' 00:16:20.022 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:20.281 [2024-05-14 11:52:47.268659] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:20.281 [2024-05-14 11:52:47.268685] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:20.281 [2024-05-14 11:52:47.268740] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:20.281 [2024-05-14 11:52:47.268814] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:20.281 [2024-05-14 11:52:47.268828] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd3f7a0 name raid_bdev1, state offline 00:16:20.281 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.281 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:16:20.540 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:16:20.540 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:16:20.540 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:20.540 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:20.799 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:20.799 11:52:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:21.058 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:21.058 11:52:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:21.317 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:21.317 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:16:21.576 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:21.576 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:21.835 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:16:21.835 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:21.835 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:21.836 [2024-05-14 11:52:48.860802] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:21.836 [2024-05-14 11:52:48.862204] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:21.836 [2024-05-14 11:52:48.862248] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:21.836 [2024-05-14 11:52:48.862281] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:16:21.836 [2024-05-14 11:52:48.862329] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 
00:16:21.836 [2024-05-14 11:52:48.862384] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:21.836 [2024-05-14 11:52:48.862419] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:21.836 [2024-05-14 11:52:48.862442] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:16:21.836 [2024-05-14 11:52:48.862460] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:21.836 [2024-05-14 11:52:48.862471] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd3e160 name raid_bdev1, state configuring 00:16:21.836 request: 00:16:21.836 { 00:16:21.836 "name": "raid_bdev1", 00:16:21.836 "raid_level": "raid0", 00:16:21.836 "base_bdevs": [ 00:16:21.836 "malloc1", 00:16:21.836 "malloc2", 00:16:21.836 "malloc3", 00:16:21.836 "malloc4" 00:16:21.836 ], 00:16:21.836 "superblock": false, 00:16:21.836 "strip_size_kb": 64, 00:16:21.836 "method": "bdev_raid_create", 00:16:21.836 "req_id": 1 00:16:21.836 } 00:16:21.836 Got JSON-RPC error response 00:16:21.836 response: 00:16:21.836 { 00:16:21.836 "code": -17, 00:16:21.836 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:21.836 } 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:21.836 11:52:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.836 11:52:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:16:22.095 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:16:22.095 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:16:22.095 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:22.354 [2024-05-14 11:52:49.333985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:22.354 [2024-05-14 11:52:49.334033] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:22.354 [2024-05-14 11:52:49.334056] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeeb040 00:16:22.354 [2024-05-14 11:52:49.334069] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:22.354 [2024-05-14 11:52:49.335763] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:22.354 [2024-05-14 11:52:49.335793] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:22.354 [2024-05-14 11:52:49.335862] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:16:22.354 [2024-05-14 11:52:49.335889] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:22.354 pt1 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:22.354 
11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:22.354 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.613 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:22.613 "name": "raid_bdev1", 00:16:22.613 "uuid": "0843e91a-eb87-421a-a365-859ad33316f1", 00:16:22.613 "strip_size_kb": 64, 00:16:22.613 "state": "configuring", 00:16:22.613 "raid_level": "raid0", 00:16:22.613 "superblock": true, 00:16:22.613 "num_base_bdevs": 4, 00:16:22.613 "num_base_bdevs_discovered": 1, 00:16:22.613 "num_base_bdevs_operational": 4, 00:16:22.613 "base_bdevs_list": [ 00:16:22.613 { 00:16:22.613 "name": "pt1", 00:16:22.613 "uuid": "d6920e4e-5cae-5d71-81e8-fea0d4e1d044", 00:16:22.613 "is_configured": true, 00:16:22.613 "data_offset": 2048, 00:16:22.613 "data_size": 63488 00:16:22.613 }, 00:16:22.613 { 00:16:22.613 "name": null, 00:16:22.613 "uuid": "a3730f03-3fbe-53a7-ac46-975544db1cd5", 00:16:22.613 "is_configured": false, 00:16:22.613 "data_offset": 2048, 00:16:22.613 "data_size": 63488 00:16:22.613 }, 00:16:22.613 { 00:16:22.613 "name": null, 00:16:22.613 "uuid": "8cfe2fe6-18d0-5e80-a304-74222d818d91", 
00:16:22.613 "is_configured": false, 00:16:22.613 "data_offset": 2048, 00:16:22.613 "data_size": 63488 00:16:22.613 }, 00:16:22.613 { 00:16:22.613 "name": null, 00:16:22.613 "uuid": "88933881-3d11-52e0-aef2-4e680cf3a6cf", 00:16:22.613 "is_configured": false, 00:16:22.613 "data_offset": 2048, 00:16:22.613 "data_size": 63488 00:16:22.613 } 00:16:22.613 ] 00:16:22.613 }' 00:16:22.613 11:52:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:22.613 11:52:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.181 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 4 -gt 2 ']' 00:16:23.181 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:23.440 [2024-05-14 11:52:50.396962] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:23.440 [2024-05-14 11:52:50.397013] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:23.440 [2024-05-14 11:52:50.397039] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd43aa0 00:16:23.440 [2024-05-14 11:52:50.397051] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:23.440 [2024-05-14 11:52:50.397394] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:23.440 [2024-05-14 11:52:50.397421] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:23.440 [2024-05-14 11:52:50.397485] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:16:23.440 [2024-05-14 11:52:50.397505] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:23.440 pt2 00:16:23.440 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:23.699 [2024-05-14 11:52:50.581460] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.699 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:23.959 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:23.959 "name": "raid_bdev1", 00:16:23.959 "uuid": "0843e91a-eb87-421a-a365-859ad33316f1", 00:16:23.959 "strip_size_kb": 64, 00:16:23.959 "state": "configuring", 00:16:23.959 "raid_level": "raid0", 00:16:23.959 "superblock": true, 
00:16:23.959 "num_base_bdevs": 4, 00:16:23.959 "num_base_bdevs_discovered": 1, 00:16:23.959 "num_base_bdevs_operational": 4, 00:16:23.959 "base_bdevs_list": [ 00:16:23.959 { 00:16:23.959 "name": "pt1", 00:16:23.959 "uuid": "d6920e4e-5cae-5d71-81e8-fea0d4e1d044", 00:16:23.959 "is_configured": true, 00:16:23.959 "data_offset": 2048, 00:16:23.959 "data_size": 63488 00:16:23.959 }, 00:16:23.959 { 00:16:23.959 "name": null, 00:16:23.959 "uuid": "a3730f03-3fbe-53a7-ac46-975544db1cd5", 00:16:23.959 "is_configured": false, 00:16:23.959 "data_offset": 2048, 00:16:23.959 "data_size": 63488 00:16:23.959 }, 00:16:23.959 { 00:16:23.959 "name": null, 00:16:23.959 "uuid": "8cfe2fe6-18d0-5e80-a304-74222d818d91", 00:16:23.959 "is_configured": false, 00:16:23.959 "data_offset": 2048, 00:16:23.959 "data_size": 63488 00:16:23.959 }, 00:16:23.959 { 00:16:23.959 "name": null, 00:16:23.959 "uuid": "88933881-3d11-52e0-aef2-4e680cf3a6cf", 00:16:23.959 "is_configured": false, 00:16:23.959 "data_offset": 2048, 00:16:23.959 "data_size": 63488 00:16:23.959 } 00:16:23.959 ] 00:16:23.959 }' 00:16:23.959 11:52:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:23.959 11:52:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.528 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:16:24.528 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:24.528 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:24.788 [2024-05-14 11:52:51.680361] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:24.788 [2024-05-14 11:52:51.680420] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:24.788 [2024-05-14 
11:52:51.680443] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd3fcf0 00:16:24.788 [2024-05-14 11:52:51.680452] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:24.788 [2024-05-14 11:52:51.680761] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:24.788 [2024-05-14 11:52:51.680777] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:24.788 [2024-05-14 11:52:51.680834] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:16:24.788 [2024-05-14 11:52:51.680849] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:24.788 pt2 00:16:24.788 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:16:24.788 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:24.788 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:25.047 [2024-05-14 11:52:51.924982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:25.047 [2024-05-14 11:52:51.925010] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:25.047 [2024-05-14 11:52:51.925025] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd3f170 00:16:25.047 [2024-05-14 11:52:51.925034] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:25.047 [2024-05-14 11:52:51.925288] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:25.047 [2024-05-14 11:52:51.925300] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:25.047 [2024-05-14 11:52:51.925345] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: 
raid superblock found on bdev pt3 00:16:25.047 [2024-05-14 11:52:51.925357] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:25.047 pt3 00:16:25.047 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:16:25.048 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:25.048 11:52:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:25.308 [2024-05-14 11:52:52.177635] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:25.308 [2024-05-14 11:52:52.177660] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:25.308 [2024-05-14 11:52:52.177674] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd3e860 00:16:25.308 [2024-05-14 11:52:52.177682] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:25.308 [2024-05-14 11:52:52.177918] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:25.308 [2024-05-14 11:52:52.177930] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:25.308 [2024-05-14 11:52:52.177970] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4 00:16:25.308 [2024-05-14 11:52:52.177983] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:25.308 [2024-05-14 11:52:52.178070] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xd3a850 00:16:25.308 [2024-05-14 11:52:52.178077] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:25.308 [2024-05-14 11:52:52.178211] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd43770 00:16:25.308 [2024-05-14 11:52:52.178302] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd3a850 00:16:25.308 [2024-05-14 11:52:52.178308] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd3a850 00:16:25.308 [2024-05-14 11:52:52.178372] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:25.308 pt4 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.308 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:16:25.567 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:25.567 "name": "raid_bdev1", 00:16:25.567 "uuid": "0843e91a-eb87-421a-a365-859ad33316f1", 00:16:25.567 "strip_size_kb": 64, 00:16:25.567 "state": "online", 00:16:25.567 "raid_level": "raid0", 00:16:25.567 "superblock": true, 00:16:25.567 "num_base_bdevs": 4, 00:16:25.567 "num_base_bdevs_discovered": 4, 00:16:25.567 "num_base_bdevs_operational": 4, 00:16:25.567 "base_bdevs_list": [ 00:16:25.567 { 00:16:25.567 "name": "pt1", 00:16:25.567 "uuid": "d6920e4e-5cae-5d71-81e8-fea0d4e1d044", 00:16:25.567 "is_configured": true, 00:16:25.567 "data_offset": 2048, 00:16:25.567 "data_size": 63488 00:16:25.567 }, 00:16:25.567 { 00:16:25.567 "name": "pt2", 00:16:25.567 "uuid": "a3730f03-3fbe-53a7-ac46-975544db1cd5", 00:16:25.567 "is_configured": true, 00:16:25.567 "data_offset": 2048, 00:16:25.567 "data_size": 63488 00:16:25.567 }, 00:16:25.567 { 00:16:25.567 "name": "pt3", 00:16:25.567 "uuid": "8cfe2fe6-18d0-5e80-a304-74222d818d91", 00:16:25.567 "is_configured": true, 00:16:25.567 "data_offset": 2048, 00:16:25.567 "data_size": 63488 00:16:25.567 }, 00:16:25.567 { 00:16:25.567 "name": "pt4", 00:16:25.567 "uuid": "88933881-3d11-52e0-aef2-4e680cf3a6cf", 00:16:25.567 "is_configured": true, 00:16:25.567 "data_offset": 2048, 00:16:25.567 "data_size": 63488 00:16:25.567 } 00:16:25.567 ] 00:16:25.567 }' 00:16:25.567 11:52:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:25.567 11:52:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.135 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:16:26.135 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:16:26.135 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:26.135 11:52:53 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:26.135 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:26.135 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:26.135 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:26.135 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:26.395 [2024-05-14 11:52:53.244577] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:26.395 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:26.395 "name": "raid_bdev1", 00:16:26.395 "aliases": [ 00:16:26.395 "0843e91a-eb87-421a-a365-859ad33316f1" 00:16:26.395 ], 00:16:26.395 "product_name": "Raid Volume", 00:16:26.395 "block_size": 512, 00:16:26.395 "num_blocks": 253952, 00:16:26.395 "uuid": "0843e91a-eb87-421a-a365-859ad33316f1", 00:16:26.395 "assigned_rate_limits": { 00:16:26.395 "rw_ios_per_sec": 0, 00:16:26.395 "rw_mbytes_per_sec": 0, 00:16:26.395 "r_mbytes_per_sec": 0, 00:16:26.395 "w_mbytes_per_sec": 0 00:16:26.395 }, 00:16:26.395 "claimed": false, 00:16:26.395 "zoned": false, 00:16:26.395 "supported_io_types": { 00:16:26.395 "read": true, 00:16:26.395 "write": true, 00:16:26.395 "unmap": true, 00:16:26.395 "write_zeroes": true, 00:16:26.395 "flush": true, 00:16:26.395 "reset": true, 00:16:26.395 "compare": false, 00:16:26.395 "compare_and_write": false, 00:16:26.395 "abort": false, 00:16:26.395 "nvme_admin": false, 00:16:26.395 "nvme_io": false 00:16:26.395 }, 00:16:26.395 "memory_domains": [ 00:16:26.395 { 00:16:26.395 "dma_device_id": "system", 00:16:26.395 "dma_device_type": 1 00:16:26.395 }, 00:16:26.395 { 00:16:26.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.395 "dma_device_type": 2 00:16:26.395 }, 00:16:26.395 { 
00:16:26.395 "dma_device_id": "system", 00:16:26.395 "dma_device_type": 1 00:16:26.395 }, 00:16:26.395 { 00:16:26.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.395 "dma_device_type": 2 00:16:26.395 }, 00:16:26.395 { 00:16:26.395 "dma_device_id": "system", 00:16:26.395 "dma_device_type": 1 00:16:26.395 }, 00:16:26.395 { 00:16:26.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.395 "dma_device_type": 2 00:16:26.395 }, 00:16:26.395 { 00:16:26.395 "dma_device_id": "system", 00:16:26.395 "dma_device_type": 1 00:16:26.395 }, 00:16:26.395 { 00:16:26.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.395 "dma_device_type": 2 00:16:26.395 } 00:16:26.395 ], 00:16:26.395 "driver_specific": { 00:16:26.395 "raid": { 00:16:26.395 "uuid": "0843e91a-eb87-421a-a365-859ad33316f1", 00:16:26.395 "strip_size_kb": 64, 00:16:26.395 "state": "online", 00:16:26.395 "raid_level": "raid0", 00:16:26.395 "superblock": true, 00:16:26.395 "num_base_bdevs": 4, 00:16:26.395 "num_base_bdevs_discovered": 4, 00:16:26.395 "num_base_bdevs_operational": 4, 00:16:26.395 "base_bdevs_list": [ 00:16:26.395 { 00:16:26.395 "name": "pt1", 00:16:26.395 "uuid": "d6920e4e-5cae-5d71-81e8-fea0d4e1d044", 00:16:26.395 "is_configured": true, 00:16:26.395 "data_offset": 2048, 00:16:26.395 "data_size": 63488 00:16:26.395 }, 00:16:26.395 { 00:16:26.395 "name": "pt2", 00:16:26.395 "uuid": "a3730f03-3fbe-53a7-ac46-975544db1cd5", 00:16:26.395 "is_configured": true, 00:16:26.395 "data_offset": 2048, 00:16:26.395 "data_size": 63488 00:16:26.395 }, 00:16:26.395 { 00:16:26.395 "name": "pt3", 00:16:26.395 "uuid": "8cfe2fe6-18d0-5e80-a304-74222d818d91", 00:16:26.395 "is_configured": true, 00:16:26.395 "data_offset": 2048, 00:16:26.395 "data_size": 63488 00:16:26.395 }, 00:16:26.395 { 00:16:26.395 "name": "pt4", 00:16:26.395 "uuid": "88933881-3d11-52e0-aef2-4e680cf3a6cf", 00:16:26.395 "is_configured": true, 00:16:26.395 "data_offset": 2048, 00:16:26.395 "data_size": 63488 00:16:26.395 } 00:16:26.395 ] 
00:16:26.395 } 00:16:26.395 } 00:16:26.395 }' 00:16:26.395 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:26.395 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:16:26.395 pt2 00:16:26.395 pt3 00:16:26.395 pt4' 00:16:26.395 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:26.395 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:26.395 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:26.655 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:26.655 "name": "pt1", 00:16:26.655 "aliases": [ 00:16:26.655 "d6920e4e-5cae-5d71-81e8-fea0d4e1d044" 00:16:26.655 ], 00:16:26.655 "product_name": "passthru", 00:16:26.655 "block_size": 512, 00:16:26.655 "num_blocks": 65536, 00:16:26.655 "uuid": "d6920e4e-5cae-5d71-81e8-fea0d4e1d044", 00:16:26.655 "assigned_rate_limits": { 00:16:26.655 "rw_ios_per_sec": 0, 00:16:26.655 "rw_mbytes_per_sec": 0, 00:16:26.655 "r_mbytes_per_sec": 0, 00:16:26.655 "w_mbytes_per_sec": 0 00:16:26.655 }, 00:16:26.655 "claimed": true, 00:16:26.655 "claim_type": "exclusive_write", 00:16:26.655 "zoned": false, 00:16:26.655 "supported_io_types": { 00:16:26.655 "read": true, 00:16:26.655 "write": true, 00:16:26.655 "unmap": true, 00:16:26.655 "write_zeroes": true, 00:16:26.655 "flush": true, 00:16:26.655 "reset": true, 00:16:26.655 "compare": false, 00:16:26.655 "compare_and_write": false, 00:16:26.655 "abort": true, 00:16:26.655 "nvme_admin": false, 00:16:26.655 "nvme_io": false 00:16:26.655 }, 00:16:26.655 "memory_domains": [ 00:16:26.655 { 00:16:26.655 "dma_device_id": "system", 00:16:26.655 "dma_device_type": 1 00:16:26.655 }, 
00:16:26.655 { 00:16:26.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.655 "dma_device_type": 2 00:16:26.655 } 00:16:26.655 ], 00:16:26.655 "driver_specific": { 00:16:26.655 "passthru": { 00:16:26.655 "name": "pt1", 00:16:26.655 "base_bdev_name": "malloc1" 00:16:26.655 } 00:16:26.655 } 00:16:26.655 }' 00:16:26.655 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:26.655 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:26.655 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:26.655 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:26.655 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:26.914 11:52:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:27.173 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 
00:16:27.173 "name": "pt2", 00:16:27.174 "aliases": [ 00:16:27.174 "a3730f03-3fbe-53a7-ac46-975544db1cd5" 00:16:27.174 ], 00:16:27.174 "product_name": "passthru", 00:16:27.174 "block_size": 512, 00:16:27.174 "num_blocks": 65536, 00:16:27.174 "uuid": "a3730f03-3fbe-53a7-ac46-975544db1cd5", 00:16:27.174 "assigned_rate_limits": { 00:16:27.174 "rw_ios_per_sec": 0, 00:16:27.174 "rw_mbytes_per_sec": 0, 00:16:27.174 "r_mbytes_per_sec": 0, 00:16:27.174 "w_mbytes_per_sec": 0 00:16:27.174 }, 00:16:27.174 "claimed": true, 00:16:27.174 "claim_type": "exclusive_write", 00:16:27.174 "zoned": false, 00:16:27.174 "supported_io_types": { 00:16:27.174 "read": true, 00:16:27.174 "write": true, 00:16:27.174 "unmap": true, 00:16:27.174 "write_zeroes": true, 00:16:27.174 "flush": true, 00:16:27.174 "reset": true, 00:16:27.174 "compare": false, 00:16:27.174 "compare_and_write": false, 00:16:27.174 "abort": true, 00:16:27.174 "nvme_admin": false, 00:16:27.174 "nvme_io": false 00:16:27.174 }, 00:16:27.174 "memory_domains": [ 00:16:27.174 { 00:16:27.174 "dma_device_id": "system", 00:16:27.174 "dma_device_type": 1 00:16:27.174 }, 00:16:27.174 { 00:16:27.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.174 "dma_device_type": 2 00:16:27.174 } 00:16:27.174 ], 00:16:27.174 "driver_specific": { 00:16:27.174 "passthru": { 00:16:27.174 "name": "pt2", 00:16:27.174 "base_bdev_name": "malloc2" 00:16:27.174 } 00:16:27.174 } 00:16:27.174 }' 00:16:27.174 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:27.174 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:27.174 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:27.174 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- 
# [[ null == null ]] 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:27.433 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:27.692 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:27.692 "name": "pt3", 00:16:27.692 "aliases": [ 00:16:27.692 "8cfe2fe6-18d0-5e80-a304-74222d818d91" 00:16:27.692 ], 00:16:27.692 "product_name": "passthru", 00:16:27.692 "block_size": 512, 00:16:27.692 "num_blocks": 65536, 00:16:27.692 "uuid": "8cfe2fe6-18d0-5e80-a304-74222d818d91", 00:16:27.692 "assigned_rate_limits": { 00:16:27.692 "rw_ios_per_sec": 0, 00:16:27.692 "rw_mbytes_per_sec": 0, 00:16:27.692 "r_mbytes_per_sec": 0, 00:16:27.692 "w_mbytes_per_sec": 0 00:16:27.692 }, 00:16:27.692 "claimed": true, 00:16:27.692 "claim_type": "exclusive_write", 00:16:27.692 "zoned": false, 00:16:27.692 "supported_io_types": { 00:16:27.692 "read": true, 00:16:27.692 "write": true, 00:16:27.692 "unmap": true, 00:16:27.692 "write_zeroes": true, 00:16:27.692 "flush": true, 00:16:27.692 "reset": true, 00:16:27.692 "compare": false, 00:16:27.692 "compare_and_write": false, 
00:16:27.692 "abort": true, 00:16:27.692 "nvme_admin": false, 00:16:27.692 "nvme_io": false 00:16:27.692 }, 00:16:27.692 "memory_domains": [ 00:16:27.692 { 00:16:27.692 "dma_device_id": "system", 00:16:27.692 "dma_device_type": 1 00:16:27.692 }, 00:16:27.692 { 00:16:27.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.692 "dma_device_type": 2 00:16:27.692 } 00:16:27.692 ], 00:16:27.692 "driver_specific": { 00:16:27.692 "passthru": { 00:16:27.692 "name": "pt3", 00:16:27.692 "base_bdev_name": "malloc3" 00:16:27.692 } 00:16:27.692 } 00:16:27.692 }' 00:16:27.692 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:27.952 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:27.952 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:27.952 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:27.952 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:27.952 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:27.952 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:27.952 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:27.952 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.952 11:52:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:28.211 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:28.211 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:28.211 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:28.211 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:28.211 11:52:55 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:28.471 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:28.471 "name": "pt4", 00:16:28.471 "aliases": [ 00:16:28.471 "88933881-3d11-52e0-aef2-4e680cf3a6cf" 00:16:28.471 ], 00:16:28.471 "product_name": "passthru", 00:16:28.471 "block_size": 512, 00:16:28.471 "num_blocks": 65536, 00:16:28.471 "uuid": "88933881-3d11-52e0-aef2-4e680cf3a6cf", 00:16:28.471 "assigned_rate_limits": { 00:16:28.471 "rw_ios_per_sec": 0, 00:16:28.471 "rw_mbytes_per_sec": 0, 00:16:28.471 "r_mbytes_per_sec": 0, 00:16:28.471 "w_mbytes_per_sec": 0 00:16:28.471 }, 00:16:28.471 "claimed": true, 00:16:28.471 "claim_type": "exclusive_write", 00:16:28.471 "zoned": false, 00:16:28.471 "supported_io_types": { 00:16:28.471 "read": true, 00:16:28.471 "write": true, 00:16:28.471 "unmap": true, 00:16:28.471 "write_zeroes": true, 00:16:28.471 "flush": true, 00:16:28.471 "reset": true, 00:16:28.471 "compare": false, 00:16:28.471 "compare_and_write": false, 00:16:28.471 "abort": true, 00:16:28.471 "nvme_admin": false, 00:16:28.471 "nvme_io": false 00:16:28.471 }, 00:16:28.471 "memory_domains": [ 00:16:28.471 { 00:16:28.471 "dma_device_id": "system", 00:16:28.471 "dma_device_type": 1 00:16:28.471 }, 00:16:28.471 { 00:16:28.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.471 "dma_device_type": 2 00:16:28.471 } 00:16:28.471 ], 00:16:28.471 "driver_specific": { 00:16:28.471 "passthru": { 00:16:28.471 "name": "pt4", 00:16:28.471 "base_bdev_name": "malloc4" 00:16:28.471 } 00:16:28.471 } 00:16:28.471 }' 00:16:28.471 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:28.471 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:28.471 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:28.471 11:52:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:28.471 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:28.471 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:28.471 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:28.471 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:28.730 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:28.730 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:28.730 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:28.730 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:28.730 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:16:28.730 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:28.989 [2024-05-14 11:52:55.875366] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 0843e91a-eb87-421a-a365-859ad33316f1 '!=' 0843e91a-eb87-421a-a365-859ad33316f1 ']' 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid0 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 1721909 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 1721909 ']' 00:16:28.989 11:52:55 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 1721909 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1721909 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1721909' 00:16:28.989 killing process with pid 1721909 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 1721909 00:16:28.989 [2024-05-14 11:52:55.941341] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:28.989 [2024-05-14 11:52:55.941396] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:28.989 [2024-05-14 11:52:55.941468] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:28.989 [2024-05-14 11:52:55.941477] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd3a850 name raid_bdev1, state offline 00:16:28.989 11:52:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 1721909 00:16:28.989 [2024-05-14 11:52:56.009803] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:29.559 11:52:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:16:29.559 00:16:29.559 real 0m16.164s 00:16:29.559 user 0m29.069s 00:16:29.559 sys 0m2.783s 00:16:29.559 11:52:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:16:29.559 11:52:56 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.559 ************************************ 00:16:29.559 END TEST raid_superblock_test 00:16:29.559 ************************************ 00:16:29.559 11:52:56 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:16:29.559 11:52:56 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:16:29.559 11:52:56 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:16:29.559 11:52:56 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:16:29.559 11:52:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:29.559 ************************************ 00:16:29.559 START TEST raid_state_function_test 00:16:29.559 ************************************ 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 4 false 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- 
# '[' false = true ']' 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=1724342 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1724342' 00:16:29.559 Process raid pid: 1724342 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 1724342 /var/tmp/spdk-raid.sock 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 1724342 ']' 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:29.559 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:16:29.559 11:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.559 [2024-05-14 11:52:56.534409] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:16:29.559 [2024-05-14 11:52:56.534478] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:29.819 [2024-05-14 11:52:56.666590] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:29.819 [2024-05-14 11:52:56.768027] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:29.819 [2024-05-14 11:52:56.832381] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:29.819 [2024-05-14 11:52:56.832432] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:30.387 11:52:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:16:30.387 11:52:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:16:30.387 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:30.649 [2024-05-14 11:52:57.683821] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:30.649 [2024-05-14 11:52:57.683865] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:30.649 [2024-05-14 11:52:57.683875] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:30.649 [2024-05-14 11:52:57.683887] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:30.649 [2024-05-14 11:52:57.683896] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:30.649 [2024-05-14 11:52:57.683908] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:30.649 [2024-05-14 
11:52:57.683917] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:30.649 [2024-05-14 11:52:57.683927] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.649 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:30.973 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:30.973 "name": "Existed_Raid", 00:16:30.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.973 "strip_size_kb": 64, 00:16:30.973 "state": 
"configuring", 00:16:30.973 "raid_level": "concat", 00:16:30.973 "superblock": false, 00:16:30.973 "num_base_bdevs": 4, 00:16:30.973 "num_base_bdevs_discovered": 0, 00:16:30.973 "num_base_bdevs_operational": 4, 00:16:30.973 "base_bdevs_list": [ 00:16:30.973 { 00:16:30.973 "name": "BaseBdev1", 00:16:30.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.973 "is_configured": false, 00:16:30.973 "data_offset": 0, 00:16:30.973 "data_size": 0 00:16:30.973 }, 00:16:30.973 { 00:16:30.973 "name": "BaseBdev2", 00:16:30.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.973 "is_configured": false, 00:16:30.973 "data_offset": 0, 00:16:30.973 "data_size": 0 00:16:30.973 }, 00:16:30.973 { 00:16:30.973 "name": "BaseBdev3", 00:16:30.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.973 "is_configured": false, 00:16:30.973 "data_offset": 0, 00:16:30.973 "data_size": 0 00:16:30.973 }, 00:16:30.973 { 00:16:30.973 "name": "BaseBdev4", 00:16:30.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.973 "is_configured": false, 00:16:30.973 "data_offset": 0, 00:16:30.973 "data_size": 0 00:16:30.973 } 00:16:30.973 ] 00:16:30.973 }' 00:16:30.973 11:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:30.973 11:52:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.541 11:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:31.800 [2024-05-14 11:52:58.762756] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:31.800 [2024-05-14 11:52:58.762788] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd77720 name Existed_Raid, state configuring 00:16:31.800 11:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:32.059 [2024-05-14 11:52:59.003410] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:32.060 [2024-05-14 11:52:59.003438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:32.060 [2024-05-14 11:52:59.003449] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:32.060 [2024-05-14 11:52:59.003460] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:32.060 [2024-05-14 11:52:59.003469] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:32.060 [2024-05-14 11:52:59.003480] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:32.060 [2024-05-14 11:52:59.003489] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:32.060 [2024-05-14 11:52:59.003501] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:32.060 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:32.319 [2024-05-14 11:52:59.241889] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:32.319 BaseBdev1 00:16:32.319 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:16:32.319 11:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:16:32.319 11:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:32.319 11:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:32.319 11:52:59 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:32.319 11:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:32.319 11:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:32.577 11:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:32.836 [ 00:16:32.836 { 00:16:32.836 "name": "BaseBdev1", 00:16:32.836 "aliases": [ 00:16:32.836 "1ae0626c-d7d1-4fed-a51e-1f968e7693b3" 00:16:32.836 ], 00:16:32.836 "product_name": "Malloc disk", 00:16:32.836 "block_size": 512, 00:16:32.836 "num_blocks": 65536, 00:16:32.836 "uuid": "1ae0626c-d7d1-4fed-a51e-1f968e7693b3", 00:16:32.836 "assigned_rate_limits": { 00:16:32.836 "rw_ios_per_sec": 0, 00:16:32.836 "rw_mbytes_per_sec": 0, 00:16:32.836 "r_mbytes_per_sec": 0, 00:16:32.836 "w_mbytes_per_sec": 0 00:16:32.836 }, 00:16:32.836 "claimed": true, 00:16:32.836 "claim_type": "exclusive_write", 00:16:32.836 "zoned": false, 00:16:32.836 "supported_io_types": { 00:16:32.836 "read": true, 00:16:32.836 "write": true, 00:16:32.836 "unmap": true, 00:16:32.836 "write_zeroes": true, 00:16:32.836 "flush": true, 00:16:32.836 "reset": true, 00:16:32.836 "compare": false, 00:16:32.836 "compare_and_write": false, 00:16:32.836 "abort": true, 00:16:32.836 "nvme_admin": false, 00:16:32.836 "nvme_io": false 00:16:32.836 }, 00:16:32.836 "memory_domains": [ 00:16:32.836 { 00:16:32.836 "dma_device_id": "system", 00:16:32.836 "dma_device_type": 1 00:16:32.836 }, 00:16:32.836 { 00:16:32.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.836 "dma_device_type": 2 00:16:32.836 } 00:16:32.836 ], 00:16:32.836 "driver_specific": {} 00:16:32.836 } 00:16:32.836 ] 00:16:32.836 
11:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.836 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.096 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:33.096 "name": "Existed_Raid", 00:16:33.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.096 "strip_size_kb": 64, 00:16:33.096 "state": "configuring", 00:16:33.096 "raid_level": "concat", 00:16:33.096 "superblock": false, 00:16:33.096 "num_base_bdevs": 4, 00:16:33.096 
"num_base_bdevs_discovered": 1, 00:16:33.096 "num_base_bdevs_operational": 4, 00:16:33.096 "base_bdevs_list": [ 00:16:33.096 { 00:16:33.096 "name": "BaseBdev1", 00:16:33.096 "uuid": "1ae0626c-d7d1-4fed-a51e-1f968e7693b3", 00:16:33.096 "is_configured": true, 00:16:33.096 "data_offset": 0, 00:16:33.096 "data_size": 65536 00:16:33.096 }, 00:16:33.096 { 00:16:33.096 "name": "BaseBdev2", 00:16:33.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.096 "is_configured": false, 00:16:33.096 "data_offset": 0, 00:16:33.096 "data_size": 0 00:16:33.096 }, 00:16:33.096 { 00:16:33.096 "name": "BaseBdev3", 00:16:33.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.096 "is_configured": false, 00:16:33.096 "data_offset": 0, 00:16:33.096 "data_size": 0 00:16:33.096 }, 00:16:33.096 { 00:16:33.096 "name": "BaseBdev4", 00:16:33.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.096 "is_configured": false, 00:16:33.096 "data_offset": 0, 00:16:33.096 "data_size": 0 00:16:33.096 } 00:16:33.096 ] 00:16:33.096 }' 00:16:33.096 11:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:33.096 11:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:33.663 11:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:33.921 [2024-05-14 11:53:00.798222] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:33.921 [2024-05-14 11:53:00.798269] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd76fb0 name Existed_Raid, state configuring 00:16:33.921 11:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:34.180 
[2024-05-14 11:53:01.054932] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:34.180 [2024-05-14 11:53:01.056599] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:34.180 [2024-05-14 11:53:01.056633] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:34.180 [2024-05-14 11:53:01.056643] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:34.180 [2024-05-14 11:53:01.056655] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:34.180 [2024-05-14 11:53:01.056664] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:34.180 [2024-05-14 11:53:01.056676] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:34.180 11:53:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.180 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:34.439 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:34.439 "name": "Existed_Raid", 00:16:34.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.439 "strip_size_kb": 64, 00:16:34.439 "state": "configuring", 00:16:34.439 "raid_level": "concat", 00:16:34.439 "superblock": false, 00:16:34.439 "num_base_bdevs": 4, 00:16:34.439 "num_base_bdevs_discovered": 1, 00:16:34.439 "num_base_bdevs_operational": 4, 00:16:34.439 "base_bdevs_list": [ 00:16:34.439 { 00:16:34.439 "name": "BaseBdev1", 00:16:34.439 "uuid": "1ae0626c-d7d1-4fed-a51e-1f968e7693b3", 00:16:34.439 "is_configured": true, 00:16:34.439 "data_offset": 0, 00:16:34.439 "data_size": 65536 00:16:34.439 }, 00:16:34.439 { 00:16:34.439 "name": "BaseBdev2", 00:16:34.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.439 "is_configured": false, 00:16:34.439 "data_offset": 0, 00:16:34.439 "data_size": 0 00:16:34.439 }, 00:16:34.439 { 00:16:34.439 "name": "BaseBdev3", 00:16:34.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.439 "is_configured": false, 00:16:34.439 "data_offset": 0, 00:16:34.439 "data_size": 0 00:16:34.439 }, 00:16:34.439 { 00:16:34.439 "name": "BaseBdev4", 00:16:34.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.439 "is_configured": false, 00:16:34.439 "data_offset": 0, 00:16:34.439 
"data_size": 0 00:16:34.439 } 00:16:34.439 ] 00:16:34.439 }' 00:16:34.439 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:34.439 11:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.007 11:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:35.007 [2024-05-14 11:53:02.082604] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:35.007 BaseBdev2 00:16:35.266 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:16:35.266 11:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:16:35.266 11:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:35.266 11:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:35.266 11:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:35.266 11:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:35.266 11:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:35.524 11:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:35.524 [ 00:16:35.524 { 00:16:35.524 "name": "BaseBdev2", 00:16:35.524 "aliases": [ 00:16:35.524 "27e2ec3b-5a1c-44c2-ab65-dcdb8952ee68" 00:16:35.524 ], 00:16:35.524 "product_name": "Malloc disk", 00:16:35.524 "block_size": 512, 00:16:35.524 "num_blocks": 65536, 
00:16:35.524 "uuid": "27e2ec3b-5a1c-44c2-ab65-dcdb8952ee68", 00:16:35.524 "assigned_rate_limits": { 00:16:35.524 "rw_ios_per_sec": 0, 00:16:35.524 "rw_mbytes_per_sec": 0, 00:16:35.524 "r_mbytes_per_sec": 0, 00:16:35.524 "w_mbytes_per_sec": 0 00:16:35.524 }, 00:16:35.524 "claimed": true, 00:16:35.524 "claim_type": "exclusive_write", 00:16:35.524 "zoned": false, 00:16:35.524 "supported_io_types": { 00:16:35.524 "read": true, 00:16:35.524 "write": true, 00:16:35.524 "unmap": true, 00:16:35.524 "write_zeroes": true, 00:16:35.524 "flush": true, 00:16:35.524 "reset": true, 00:16:35.524 "compare": false, 00:16:35.524 "compare_and_write": false, 00:16:35.524 "abort": true, 00:16:35.524 "nvme_admin": false, 00:16:35.524 "nvme_io": false 00:16:35.524 }, 00:16:35.524 "memory_domains": [ 00:16:35.524 { 00:16:35.524 "dma_device_id": "system", 00:16:35.524 "dma_device_type": 1 00:16:35.524 }, 00:16:35.524 { 00:16:35.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.524 "dma_device_type": 2 00:16:35.524 } 00:16:35.524 ], 00:16:35.524 "driver_specific": {} 00:16:35.524 } 00:16:35.524 ] 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=64 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:35.783 "name": "Existed_Raid", 00:16:35.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.783 "strip_size_kb": 64, 00:16:35.783 "state": "configuring", 00:16:35.783 "raid_level": "concat", 00:16:35.783 "superblock": false, 00:16:35.783 "num_base_bdevs": 4, 00:16:35.783 "num_base_bdevs_discovered": 2, 00:16:35.783 "num_base_bdevs_operational": 4, 00:16:35.783 "base_bdevs_list": [ 00:16:35.783 { 00:16:35.783 "name": "BaseBdev1", 00:16:35.783 "uuid": "1ae0626c-d7d1-4fed-a51e-1f968e7693b3", 00:16:35.783 "is_configured": true, 00:16:35.783 "data_offset": 0, 00:16:35.783 "data_size": 65536 00:16:35.783 }, 00:16:35.783 { 00:16:35.783 "name": "BaseBdev2", 00:16:35.783 "uuid": "27e2ec3b-5a1c-44c2-ab65-dcdb8952ee68", 00:16:35.783 "is_configured": true, 00:16:35.783 "data_offset": 0, 00:16:35.783 "data_size": 65536 00:16:35.783 }, 00:16:35.783 { 00:16:35.783 "name": "BaseBdev3", 00:16:35.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.783 "is_configured": 
false, 00:16:35.783 "data_offset": 0, 00:16:35.783 "data_size": 0 00:16:35.783 }, 00:16:35.783 { 00:16:35.783 "name": "BaseBdev4", 00:16:35.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.783 "is_configured": false, 00:16:35.783 "data_offset": 0, 00:16:35.783 "data_size": 0 00:16:35.783 } 00:16:35.783 ] 00:16:35.783 }' 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:35.783 11:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.352 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:36.611 [2024-05-14 11:53:03.527433] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:36.611 BaseBdev3 00:16:36.611 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:16:36.611 11:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:16:36.611 11:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:36.611 11:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:36.611 11:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:36.611 11:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:36.611 11:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:36.869 11:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 
00:16:36.869 [ 00:16:36.869 { 00:16:36.869 "name": "BaseBdev3", 00:16:36.869 "aliases": [ 00:16:36.869 "d2eb0746-794b-460a-b627-4f336b0ae93e" 00:16:36.869 ], 00:16:36.869 "product_name": "Malloc disk", 00:16:36.869 "block_size": 512, 00:16:36.869 "num_blocks": 65536, 00:16:36.869 "uuid": "d2eb0746-794b-460a-b627-4f336b0ae93e", 00:16:36.869 "assigned_rate_limits": { 00:16:36.869 "rw_ios_per_sec": 0, 00:16:36.869 "rw_mbytes_per_sec": 0, 00:16:36.869 "r_mbytes_per_sec": 0, 00:16:36.869 "w_mbytes_per_sec": 0 00:16:36.869 }, 00:16:36.869 "claimed": true, 00:16:36.869 "claim_type": "exclusive_write", 00:16:36.869 "zoned": false, 00:16:36.869 "supported_io_types": { 00:16:36.869 "read": true, 00:16:36.869 "write": true, 00:16:36.869 "unmap": true, 00:16:36.869 "write_zeroes": true, 00:16:36.869 "flush": true, 00:16:36.869 "reset": true, 00:16:36.869 "compare": false, 00:16:36.869 "compare_and_write": false, 00:16:36.869 "abort": true, 00:16:36.869 "nvme_admin": false, 00:16:36.869 "nvme_io": false 00:16:36.869 }, 00:16:36.869 "memory_domains": [ 00:16:36.869 { 00:16:36.869 "dma_device_id": "system", 00:16:36.869 "dma_device_type": 1 00:16:36.869 }, 00:16:36.869 { 00:16:36.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.869 "dma_device_type": 2 00:16:36.869 } 00:16:36.869 ], 00:16:36.869 "driver_specific": {} 00:16:36.869 } 00:16:36.869 ] 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.128 11:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.388 11:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:37.388 "name": "Existed_Raid", 00:16:37.388 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.388 "strip_size_kb": 64, 00:16:37.388 "state": "configuring", 00:16:37.388 "raid_level": "concat", 00:16:37.388 "superblock": false, 00:16:37.388 "num_base_bdevs": 4, 00:16:37.388 "num_base_bdevs_discovered": 3, 00:16:37.388 "num_base_bdevs_operational": 4, 00:16:37.388 "base_bdevs_list": [ 00:16:37.388 { 00:16:37.388 "name": "BaseBdev1", 00:16:37.388 "uuid": "1ae0626c-d7d1-4fed-a51e-1f968e7693b3", 00:16:37.388 "is_configured": true, 00:16:37.388 "data_offset": 0, 00:16:37.388 "data_size": 65536 00:16:37.388 }, 00:16:37.388 { 00:16:37.388 "name": "BaseBdev2", 00:16:37.388 "uuid": 
"27e2ec3b-5a1c-44c2-ab65-dcdb8952ee68", 00:16:37.388 "is_configured": true, 00:16:37.388 "data_offset": 0, 00:16:37.388 "data_size": 65536 00:16:37.388 }, 00:16:37.388 { 00:16:37.388 "name": "BaseBdev3", 00:16:37.388 "uuid": "d2eb0746-794b-460a-b627-4f336b0ae93e", 00:16:37.388 "is_configured": true, 00:16:37.388 "data_offset": 0, 00:16:37.388 "data_size": 65536 00:16:37.388 }, 00:16:37.388 { 00:16:37.388 "name": "BaseBdev4", 00:16:37.388 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.388 "is_configured": false, 00:16:37.388 "data_offset": 0, 00:16:37.388 "data_size": 0 00:16:37.388 } 00:16:37.388 ] 00:16:37.388 }' 00:16:37.388 11:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:37.388 11:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:37.957 11:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:37.957 [2024-05-14 11:53:04.972300] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:37.957 [2024-05-14 11:53:04.972337] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xd781b0 00:16:37.957 [2024-05-14 11:53:04.972346] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:37.957 [2024-05-14 11:53:04.972570] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd79860 00:16:37.957 [2024-05-14 11:53:04.972705] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd781b0 00:16:37.957 [2024-05-14 11:53:04.972715] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd781b0 00:16:37.957 [2024-05-14 11:53:04.972884] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:37.957 BaseBdev4 00:16:37.957 11:53:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:16:37.957 11:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:16:37.957 11:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:37.957 11:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:37.957 11:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:37.957 11:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:37.957 11:53:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.216 11:53:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:38.475 [ 00:16:38.475 { 00:16:38.475 "name": "BaseBdev4", 00:16:38.475 "aliases": [ 00:16:38.475 "a5a2abe9-56ba-417c-b287-a7d42e6dabed" 00:16:38.475 ], 00:16:38.475 "product_name": "Malloc disk", 00:16:38.475 "block_size": 512, 00:16:38.475 "num_blocks": 65536, 00:16:38.475 "uuid": "a5a2abe9-56ba-417c-b287-a7d42e6dabed", 00:16:38.475 "assigned_rate_limits": { 00:16:38.475 "rw_ios_per_sec": 0, 00:16:38.475 "rw_mbytes_per_sec": 0, 00:16:38.475 "r_mbytes_per_sec": 0, 00:16:38.475 "w_mbytes_per_sec": 0 00:16:38.475 }, 00:16:38.475 "claimed": true, 00:16:38.475 "claim_type": "exclusive_write", 00:16:38.475 "zoned": false, 00:16:38.475 "supported_io_types": { 00:16:38.475 "read": true, 00:16:38.475 "write": true, 00:16:38.475 "unmap": true, 00:16:38.475 "write_zeroes": true, 00:16:38.475 "flush": true, 00:16:38.475 "reset": true, 00:16:38.475 "compare": false, 00:16:38.475 "compare_and_write": false, 
00:16:38.475 "abort": true, 00:16:38.475 "nvme_admin": false, 00:16:38.475 "nvme_io": false 00:16:38.475 }, 00:16:38.475 "memory_domains": [ 00:16:38.475 { 00:16:38.475 "dma_device_id": "system", 00:16:38.475 "dma_device_type": 1 00:16:38.475 }, 00:16:38.475 { 00:16:38.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.475 "dma_device_type": 2 00:16:38.475 } 00:16:38.475 ], 00:16:38.475 "driver_specific": {} 00:16:38.475 } 00:16:38.475 ] 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.475 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:38.734 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:38.734 "name": "Existed_Raid", 00:16:38.734 "uuid": "f2b23e95-fd57-4cde-81df-1100eab9e7d6", 00:16:38.734 "strip_size_kb": 64, 00:16:38.734 "state": "online", 00:16:38.734 "raid_level": "concat", 00:16:38.734 "superblock": false, 00:16:38.734 "num_base_bdevs": 4, 00:16:38.734 "num_base_bdevs_discovered": 4, 00:16:38.734 "num_base_bdevs_operational": 4, 00:16:38.734 "base_bdevs_list": [ 00:16:38.734 { 00:16:38.734 "name": "BaseBdev1", 00:16:38.734 "uuid": "1ae0626c-d7d1-4fed-a51e-1f968e7693b3", 00:16:38.734 "is_configured": true, 00:16:38.734 "data_offset": 0, 00:16:38.734 "data_size": 65536 00:16:38.734 }, 00:16:38.734 { 00:16:38.734 "name": "BaseBdev2", 00:16:38.734 "uuid": "27e2ec3b-5a1c-44c2-ab65-dcdb8952ee68", 00:16:38.734 "is_configured": true, 00:16:38.734 "data_offset": 0, 00:16:38.734 "data_size": 65536 00:16:38.734 }, 00:16:38.734 { 00:16:38.734 "name": "BaseBdev3", 00:16:38.734 "uuid": "d2eb0746-794b-460a-b627-4f336b0ae93e", 00:16:38.734 "is_configured": true, 00:16:38.734 "data_offset": 0, 00:16:38.734 "data_size": 65536 00:16:38.734 }, 00:16:38.734 { 00:16:38.734 "name": "BaseBdev4", 00:16:38.734 "uuid": "a5a2abe9-56ba-417c-b287-a7d42e6dabed", 00:16:38.734 "is_configured": true, 00:16:38.734 "data_offset": 0, 00:16:38.734 "data_size": 65536 00:16:38.734 } 00:16:38.734 ] 00:16:38.734 }' 00:16:38.734 11:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:38.734 11:53:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:39.303 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties 
Existed_Raid 00:16:39.303 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:16:39.303 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:39.303 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:39.303 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:39.303 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:39.304 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:39.304 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:39.561 [2024-05-14 11:53:06.512673] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:39.562 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:39.562 "name": "Existed_Raid", 00:16:39.562 "aliases": [ 00:16:39.562 "f2b23e95-fd57-4cde-81df-1100eab9e7d6" 00:16:39.562 ], 00:16:39.562 "product_name": "Raid Volume", 00:16:39.562 "block_size": 512, 00:16:39.562 "num_blocks": 262144, 00:16:39.562 "uuid": "f2b23e95-fd57-4cde-81df-1100eab9e7d6", 00:16:39.562 "assigned_rate_limits": { 00:16:39.562 "rw_ios_per_sec": 0, 00:16:39.562 "rw_mbytes_per_sec": 0, 00:16:39.562 "r_mbytes_per_sec": 0, 00:16:39.562 "w_mbytes_per_sec": 0 00:16:39.562 }, 00:16:39.562 "claimed": false, 00:16:39.562 "zoned": false, 00:16:39.562 "supported_io_types": { 00:16:39.562 "read": true, 00:16:39.562 "write": true, 00:16:39.562 "unmap": true, 00:16:39.562 "write_zeroes": true, 00:16:39.562 "flush": true, 00:16:39.562 "reset": true, 00:16:39.562 "compare": false, 00:16:39.562 "compare_and_write": false, 00:16:39.562 "abort": false, 00:16:39.562 "nvme_admin": false, 
00:16:39.562 "nvme_io": false 00:16:39.562 }, 00:16:39.562 "memory_domains": [ 00:16:39.562 { 00:16:39.562 "dma_device_id": "system", 00:16:39.562 "dma_device_type": 1 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.562 "dma_device_type": 2 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "dma_device_id": "system", 00:16:39.562 "dma_device_type": 1 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.562 "dma_device_type": 2 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "dma_device_id": "system", 00:16:39.562 "dma_device_type": 1 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.562 "dma_device_type": 2 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "dma_device_id": "system", 00:16:39.562 "dma_device_type": 1 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.562 "dma_device_type": 2 00:16:39.562 } 00:16:39.562 ], 00:16:39.562 "driver_specific": { 00:16:39.562 "raid": { 00:16:39.562 "uuid": "f2b23e95-fd57-4cde-81df-1100eab9e7d6", 00:16:39.562 "strip_size_kb": 64, 00:16:39.562 "state": "online", 00:16:39.562 "raid_level": "concat", 00:16:39.562 "superblock": false, 00:16:39.562 "num_base_bdevs": 4, 00:16:39.562 "num_base_bdevs_discovered": 4, 00:16:39.562 "num_base_bdevs_operational": 4, 00:16:39.562 "base_bdevs_list": [ 00:16:39.562 { 00:16:39.562 "name": "BaseBdev1", 00:16:39.562 "uuid": "1ae0626c-d7d1-4fed-a51e-1f968e7693b3", 00:16:39.562 "is_configured": true, 00:16:39.562 "data_offset": 0, 00:16:39.562 "data_size": 65536 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "name": "BaseBdev2", 00:16:39.562 "uuid": "27e2ec3b-5a1c-44c2-ab65-dcdb8952ee68", 00:16:39.562 "is_configured": true, 00:16:39.562 "data_offset": 0, 00:16:39.562 "data_size": 65536 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "name": "BaseBdev3", 00:16:39.562 "uuid": "d2eb0746-794b-460a-b627-4f336b0ae93e", 00:16:39.562 "is_configured": 
true, 00:16:39.562 "data_offset": 0, 00:16:39.562 "data_size": 65536 00:16:39.562 }, 00:16:39.562 { 00:16:39.562 "name": "BaseBdev4", 00:16:39.562 "uuid": "a5a2abe9-56ba-417c-b287-a7d42e6dabed", 00:16:39.562 "is_configured": true, 00:16:39.562 "data_offset": 0, 00:16:39.562 "data_size": 65536 00:16:39.562 } 00:16:39.562 ] 00:16:39.562 } 00:16:39.562 } 00:16:39.562 }' 00:16:39.562 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:39.562 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:16:39.562 BaseBdev2 00:16:39.562 BaseBdev3 00:16:39.562 BaseBdev4' 00:16:39.562 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:39.562 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:39.562 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:39.821 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:39.821 "name": "BaseBdev1", 00:16:39.821 "aliases": [ 00:16:39.821 "1ae0626c-d7d1-4fed-a51e-1f968e7693b3" 00:16:39.821 ], 00:16:39.821 "product_name": "Malloc disk", 00:16:39.821 "block_size": 512, 00:16:39.821 "num_blocks": 65536, 00:16:39.821 "uuid": "1ae0626c-d7d1-4fed-a51e-1f968e7693b3", 00:16:39.821 "assigned_rate_limits": { 00:16:39.821 "rw_ios_per_sec": 0, 00:16:39.821 "rw_mbytes_per_sec": 0, 00:16:39.821 "r_mbytes_per_sec": 0, 00:16:39.821 "w_mbytes_per_sec": 0 00:16:39.821 }, 00:16:39.821 "claimed": true, 00:16:39.821 "claim_type": "exclusive_write", 00:16:39.821 "zoned": false, 00:16:39.821 "supported_io_types": { 00:16:39.821 "read": true, 00:16:39.821 "write": true, 00:16:39.821 "unmap": true, 00:16:39.821 "write_zeroes": 
true, 00:16:39.821 "flush": true, 00:16:39.821 "reset": true, 00:16:39.821 "compare": false, 00:16:39.821 "compare_and_write": false, 00:16:39.821 "abort": true, 00:16:39.821 "nvme_admin": false, 00:16:39.821 "nvme_io": false 00:16:39.821 }, 00:16:39.821 "memory_domains": [ 00:16:39.821 { 00:16:39.821 "dma_device_id": "system", 00:16:39.821 "dma_device_type": 1 00:16:39.821 }, 00:16:39.821 { 00:16:39.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.821 "dma_device_type": 2 00:16:39.821 } 00:16:39.821 ], 00:16:39.821 "driver_specific": {} 00:16:39.821 }' 00:16:39.821 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:39.821 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:40.081 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:40.081 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:40.081 11:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:40.081 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:40.081 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:40.081 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:40.081 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:40.081 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:40.081 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:40.340 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:40.340 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:40.340 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:40.340 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:40.340 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:40.340 "name": "BaseBdev2", 00:16:40.340 "aliases": [ 00:16:40.340 "27e2ec3b-5a1c-44c2-ab65-dcdb8952ee68" 00:16:40.340 ], 00:16:40.340 "product_name": "Malloc disk", 00:16:40.340 "block_size": 512, 00:16:40.340 "num_blocks": 65536, 00:16:40.340 "uuid": "27e2ec3b-5a1c-44c2-ab65-dcdb8952ee68", 00:16:40.340 "assigned_rate_limits": { 00:16:40.340 "rw_ios_per_sec": 0, 00:16:40.340 "rw_mbytes_per_sec": 0, 00:16:40.340 "r_mbytes_per_sec": 0, 00:16:40.340 "w_mbytes_per_sec": 0 00:16:40.340 }, 00:16:40.340 "claimed": true, 00:16:40.340 "claim_type": "exclusive_write", 00:16:40.340 "zoned": false, 00:16:40.340 "supported_io_types": { 00:16:40.340 "read": true, 00:16:40.340 "write": true, 00:16:40.340 "unmap": true, 00:16:40.340 "write_zeroes": true, 00:16:40.340 "flush": true, 00:16:40.340 "reset": true, 00:16:40.340 "compare": false, 00:16:40.340 "compare_and_write": false, 00:16:40.340 "abort": true, 00:16:40.340 "nvme_admin": false, 00:16:40.340 "nvme_io": false 00:16:40.340 }, 00:16:40.340 "memory_domains": [ 00:16:40.340 { 00:16:40.341 "dma_device_id": "system", 00:16:40.341 "dma_device_type": 1 00:16:40.341 }, 00:16:40.341 { 00:16:40.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.341 "dma_device_type": 2 00:16:40.341 } 00:16:40.341 ], 00:16:40.341 "driver_specific": {} 00:16:40.341 }' 00:16:40.341 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:40.599 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:40.599 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:40.599 11:53:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:40.599 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:40.599 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:40.599 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:40.599 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:40.599 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:40.599 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:40.856 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:40.857 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:40.857 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:40.857 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:40.857 11:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:41.115 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:41.115 "name": "BaseBdev3", 00:16:41.115 "aliases": [ 00:16:41.115 "d2eb0746-794b-460a-b627-4f336b0ae93e" 00:16:41.115 ], 00:16:41.115 "product_name": "Malloc disk", 00:16:41.115 "block_size": 512, 00:16:41.115 "num_blocks": 65536, 00:16:41.115 "uuid": "d2eb0746-794b-460a-b627-4f336b0ae93e", 00:16:41.115 "assigned_rate_limits": { 00:16:41.115 "rw_ios_per_sec": 0, 00:16:41.115 "rw_mbytes_per_sec": 0, 00:16:41.115 "r_mbytes_per_sec": 0, 00:16:41.115 "w_mbytes_per_sec": 0 00:16:41.115 }, 00:16:41.115 "claimed": true, 00:16:41.115 "claim_type": "exclusive_write", 
00:16:41.115 "zoned": false, 00:16:41.115 "supported_io_types": { 00:16:41.115 "read": true, 00:16:41.115 "write": true, 00:16:41.115 "unmap": true, 00:16:41.115 "write_zeroes": true, 00:16:41.115 "flush": true, 00:16:41.115 "reset": true, 00:16:41.115 "compare": false, 00:16:41.115 "compare_and_write": false, 00:16:41.115 "abort": true, 00:16:41.115 "nvme_admin": false, 00:16:41.115 "nvme_io": false 00:16:41.115 }, 00:16:41.115 "memory_domains": [ 00:16:41.115 { 00:16:41.115 "dma_device_id": "system", 00:16:41.115 "dma_device_type": 1 00:16:41.115 }, 00:16:41.115 { 00:16:41.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.115 "dma_device_type": 2 00:16:41.115 } 00:16:41.115 ], 00:16:41.115 "driver_specific": {} 00:16:41.115 }' 00:16:41.115 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:41.115 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:41.115 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:41.115 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:41.115 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:41.115 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:41.115 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:41.373 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:41.373 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:41.373 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:41.373 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:41.373 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:41.373 11:53:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:41.373 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:41.373 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:41.631 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:41.631 "name": "BaseBdev4", 00:16:41.631 "aliases": [ 00:16:41.631 "a5a2abe9-56ba-417c-b287-a7d42e6dabed" 00:16:41.631 ], 00:16:41.631 "product_name": "Malloc disk", 00:16:41.631 "block_size": 512, 00:16:41.631 "num_blocks": 65536, 00:16:41.631 "uuid": "a5a2abe9-56ba-417c-b287-a7d42e6dabed", 00:16:41.631 "assigned_rate_limits": { 00:16:41.631 "rw_ios_per_sec": 0, 00:16:41.631 "rw_mbytes_per_sec": 0, 00:16:41.631 "r_mbytes_per_sec": 0, 00:16:41.631 "w_mbytes_per_sec": 0 00:16:41.631 }, 00:16:41.631 "claimed": true, 00:16:41.631 "claim_type": "exclusive_write", 00:16:41.631 "zoned": false, 00:16:41.631 "supported_io_types": { 00:16:41.631 "read": true, 00:16:41.631 "write": true, 00:16:41.631 "unmap": true, 00:16:41.631 "write_zeroes": true, 00:16:41.631 "flush": true, 00:16:41.631 "reset": true, 00:16:41.631 "compare": false, 00:16:41.631 "compare_and_write": false, 00:16:41.631 "abort": true, 00:16:41.631 "nvme_admin": false, 00:16:41.631 "nvme_io": false 00:16:41.631 }, 00:16:41.631 "memory_domains": [ 00:16:41.631 { 00:16:41.631 "dma_device_id": "system", 00:16:41.631 "dma_device_type": 1 00:16:41.631 }, 00:16:41.631 { 00:16:41.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.631 "dma_device_type": 2 00:16:41.631 } 00:16:41.631 ], 00:16:41.631 "driver_specific": {} 00:16:41.631 }' 00:16:41.631 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:41.631 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 
-- # jq .block_size 00:16:41.631 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:41.632 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:41.890 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:41.890 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:41.890 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:41.890 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:41.890 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:41.890 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:41.890 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:41.890 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:41.890 11:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:42.149 [2024-05-14 11:53:09.191536] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:42.149 [2024-05-14 11:53:09.191562] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:42.149 [2024-05-14 11:53:09.191610] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@216 -- # return 1 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.149 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.408 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:42.408 "name": "Existed_Raid", 00:16:42.408 "uuid": "f2b23e95-fd57-4cde-81df-1100eab9e7d6", 00:16:42.408 "strip_size_kb": 64, 00:16:42.408 "state": "offline", 00:16:42.408 "raid_level": "concat", 00:16:42.408 "superblock": false, 00:16:42.408 
"num_base_bdevs": 4, 00:16:42.408 "num_base_bdevs_discovered": 3, 00:16:42.408 "num_base_bdevs_operational": 3, 00:16:42.408 "base_bdevs_list": [ 00:16:42.408 { 00:16:42.408 "name": null, 00:16:42.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.408 "is_configured": false, 00:16:42.408 "data_offset": 0, 00:16:42.408 "data_size": 65536 00:16:42.408 }, 00:16:42.408 { 00:16:42.408 "name": "BaseBdev2", 00:16:42.408 "uuid": "27e2ec3b-5a1c-44c2-ab65-dcdb8952ee68", 00:16:42.408 "is_configured": true, 00:16:42.408 "data_offset": 0, 00:16:42.408 "data_size": 65536 00:16:42.408 }, 00:16:42.408 { 00:16:42.408 "name": "BaseBdev3", 00:16:42.408 "uuid": "d2eb0746-794b-460a-b627-4f336b0ae93e", 00:16:42.408 "is_configured": true, 00:16:42.408 "data_offset": 0, 00:16:42.408 "data_size": 65536 00:16:42.408 }, 00:16:42.408 { 00:16:42.408 "name": "BaseBdev4", 00:16:42.408 "uuid": "a5a2abe9-56ba-417c-b287-a7d42e6dabed", 00:16:42.408 "is_configured": true, 00:16:42.408 "data_offset": 0, 00:16:42.408 "data_size": 65536 00:16:42.408 } 00:16:42.408 ] 00:16:42.408 }' 00:16:42.408 11:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:42.408 11:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.974 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:16:42.974 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:42.974 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.975 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:16:43.234 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:16:43.234 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:16:43.234 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:43.492 [2024-05-14 11:53:10.507232] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:43.492 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:16:43.492 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:43.492 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.492 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:16:43.751 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:16:43.751 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:43.751 11:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:44.010 [2024-05-14 11:53:11.021964] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:44.010 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:16:44.010 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:44.010 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.010 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:16:44.269 11:53:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:16:44.269 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:44.269 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:44.527 [2024-05-14 11:53:11.544654] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:44.527 [2024-05-14 11:53:11.544693] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd781b0 name Existed_Raid, state offline 00:16:44.527 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:16:44.527 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:44.528 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.528 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:16:44.804 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:16:44.804 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:16:44.804 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:16:44.804 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:16:44.804 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:44.804 11:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:45.101 BaseBdev2 00:16:45.101 11:53:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:16:45.101 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:16:45.101 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:45.101 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:45.101 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:45.101 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:45.101 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:45.369 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:45.629 [ 00:16:45.629 { 00:16:45.629 "name": "BaseBdev2", 00:16:45.629 "aliases": [ 00:16:45.629 "c4795920-6b92-44b1-adc8-9ccaa04b7121" 00:16:45.629 ], 00:16:45.629 "product_name": "Malloc disk", 00:16:45.629 "block_size": 512, 00:16:45.629 "num_blocks": 65536, 00:16:45.629 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:45.629 "assigned_rate_limits": { 00:16:45.629 "rw_ios_per_sec": 0, 00:16:45.629 "rw_mbytes_per_sec": 0, 00:16:45.629 "r_mbytes_per_sec": 0, 00:16:45.629 "w_mbytes_per_sec": 0 00:16:45.629 }, 00:16:45.629 "claimed": false, 00:16:45.629 "zoned": false, 00:16:45.629 "supported_io_types": { 00:16:45.629 "read": true, 00:16:45.629 "write": true, 00:16:45.629 "unmap": true, 00:16:45.629 "write_zeroes": true, 00:16:45.629 "flush": true, 00:16:45.629 "reset": true, 00:16:45.629 "compare": false, 00:16:45.629 "compare_and_write": false, 00:16:45.629 "abort": true, 00:16:45.629 "nvme_admin": false, 00:16:45.629 "nvme_io": false 
00:16:45.629 }, 00:16:45.629 "memory_domains": [ 00:16:45.629 { 00:16:45.629 "dma_device_id": "system", 00:16:45.629 "dma_device_type": 1 00:16:45.629 }, 00:16:45.629 { 00:16:45.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.629 "dma_device_type": 2 00:16:45.629 } 00:16:45.629 ], 00:16:45.629 "driver_specific": {} 00:16:45.629 } 00:16:45.629 ] 00:16:45.629 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:45.629 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:45.629 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:45.629 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:45.888 BaseBdev3 00:16:45.888 11:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:16:45.888 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:16:45.888 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:45.888 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:45.888 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:45.888 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:45.888 11:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:46.147 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 
2000 00:16:46.406 [ 00:16:46.406 { 00:16:46.406 "name": "BaseBdev3", 00:16:46.406 "aliases": [ 00:16:46.406 "b6ecef25-8b81-4637-9b6c-1b95a91c1cce" 00:16:46.406 ], 00:16:46.406 "product_name": "Malloc disk", 00:16:46.406 "block_size": 512, 00:16:46.406 "num_blocks": 65536, 00:16:46.406 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:46.406 "assigned_rate_limits": { 00:16:46.406 "rw_ios_per_sec": 0, 00:16:46.406 "rw_mbytes_per_sec": 0, 00:16:46.406 "r_mbytes_per_sec": 0, 00:16:46.406 "w_mbytes_per_sec": 0 00:16:46.406 }, 00:16:46.406 "claimed": false, 00:16:46.407 "zoned": false, 00:16:46.407 "supported_io_types": { 00:16:46.407 "read": true, 00:16:46.407 "write": true, 00:16:46.407 "unmap": true, 00:16:46.407 "write_zeroes": true, 00:16:46.407 "flush": true, 00:16:46.407 "reset": true, 00:16:46.407 "compare": false, 00:16:46.407 "compare_and_write": false, 00:16:46.407 "abort": true, 00:16:46.407 "nvme_admin": false, 00:16:46.407 "nvme_io": false 00:16:46.407 }, 00:16:46.407 "memory_domains": [ 00:16:46.407 { 00:16:46.407 "dma_device_id": "system", 00:16:46.407 "dma_device_type": 1 00:16:46.407 }, 00:16:46.407 { 00:16:46.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:46.407 "dma_device_type": 2 00:16:46.407 } 00:16:46.407 ], 00:16:46.407 "driver_specific": {} 00:16:46.407 } 00:16:46.407 ] 00:16:46.407 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:46.407 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:46.407 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:46.407 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:46.666 BaseBdev4 00:16:46.666 11:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:16:46.666 
11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:16:46.666 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:46.666 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:46.666 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:46.666 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:46.666 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:46.925 11:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:47.184 [ 00:16:47.184 { 00:16:47.184 "name": "BaseBdev4", 00:16:47.184 "aliases": [ 00:16:47.184 "2dcaf410-10f5-41f7-b946-ab70cabd6dab" 00:16:47.184 ], 00:16:47.184 "product_name": "Malloc disk", 00:16:47.184 "block_size": 512, 00:16:47.184 "num_blocks": 65536, 00:16:47.184 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:47.184 "assigned_rate_limits": { 00:16:47.184 "rw_ios_per_sec": 0, 00:16:47.184 "rw_mbytes_per_sec": 0, 00:16:47.184 "r_mbytes_per_sec": 0, 00:16:47.185 "w_mbytes_per_sec": 0 00:16:47.185 }, 00:16:47.185 "claimed": false, 00:16:47.185 "zoned": false, 00:16:47.185 "supported_io_types": { 00:16:47.185 "read": true, 00:16:47.185 "write": true, 00:16:47.185 "unmap": true, 00:16:47.185 "write_zeroes": true, 00:16:47.185 "flush": true, 00:16:47.185 "reset": true, 00:16:47.185 "compare": false, 00:16:47.185 "compare_and_write": false, 00:16:47.185 "abort": true, 00:16:47.185 "nvme_admin": false, 00:16:47.185 "nvme_io": false 00:16:47.185 }, 00:16:47.185 "memory_domains": [ 00:16:47.185 { 
00:16:47.185 "dma_device_id": "system", 00:16:47.185 "dma_device_type": 1 00:16:47.185 }, 00:16:47.185 { 00:16:47.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.185 "dma_device_type": 2 00:16:47.185 } 00:16:47.185 ], 00:16:47.185 "driver_specific": {} 00:16:47.185 } 00:16:47.185 ] 00:16:47.185 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:47.185 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:47.185 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:47.185 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:47.444 [2024-05-14 11:53:14.282882] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:47.444 [2024-05-14 11:53:14.282925] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:47.444 [2024-05-14 11:53:14.282944] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:47.444 [2024-05-14 11:53:14.284609] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:47.444 [2024-05-14 11:53:14.284654] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
raid_level=concat 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.444 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.703 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:47.703 "name": "Existed_Raid", 00:16:47.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.703 "strip_size_kb": 64, 00:16:47.703 "state": "configuring", 00:16:47.703 "raid_level": "concat", 00:16:47.703 "superblock": false, 00:16:47.703 "num_base_bdevs": 4, 00:16:47.703 "num_base_bdevs_discovered": 3, 00:16:47.703 "num_base_bdevs_operational": 4, 00:16:47.703 "base_bdevs_list": [ 00:16:47.703 { 00:16:47.703 "name": "BaseBdev1", 00:16:47.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.703 "is_configured": false, 00:16:47.703 "data_offset": 0, 00:16:47.703 "data_size": 0 00:16:47.703 }, 00:16:47.703 { 00:16:47.703 "name": "BaseBdev2", 00:16:47.703 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:47.703 "is_configured": true, 00:16:47.703 "data_offset": 0, 00:16:47.703 "data_size": 65536 00:16:47.703 }, 00:16:47.703 { 00:16:47.703 "name": 
"BaseBdev3", 00:16:47.703 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:47.703 "is_configured": true, 00:16:47.703 "data_offset": 0, 00:16:47.703 "data_size": 65536 00:16:47.703 }, 00:16:47.703 { 00:16:47.703 "name": "BaseBdev4", 00:16:47.703 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:47.703 "is_configured": true, 00:16:47.703 "data_offset": 0, 00:16:47.703 "data_size": 65536 00:16:47.703 } 00:16:47.703 ] 00:16:47.703 }' 00:16:47.703 11:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:47.703 11:53:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:48.270 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:48.529 [2024-05-14 11:53:15.357713] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:48.529 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:48.529 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:48.529 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:48.529 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:48.529 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:48.530 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:48.530 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:48.530 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:48.530 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # 
local num_base_bdevs_discovered 00:16:48.530 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:48.530 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.530 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.789 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:48.789 "name": "Existed_Raid", 00:16:48.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.789 "strip_size_kb": 64, 00:16:48.789 "state": "configuring", 00:16:48.789 "raid_level": "concat", 00:16:48.789 "superblock": false, 00:16:48.789 "num_base_bdevs": 4, 00:16:48.789 "num_base_bdevs_discovered": 2, 00:16:48.789 "num_base_bdevs_operational": 4, 00:16:48.789 "base_bdevs_list": [ 00:16:48.789 { 00:16:48.789 "name": "BaseBdev1", 00:16:48.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.789 "is_configured": false, 00:16:48.789 "data_offset": 0, 00:16:48.789 "data_size": 0 00:16:48.789 }, 00:16:48.789 { 00:16:48.789 "name": null, 00:16:48.789 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:48.789 "is_configured": false, 00:16:48.789 "data_offset": 0, 00:16:48.789 "data_size": 65536 00:16:48.789 }, 00:16:48.789 { 00:16:48.789 "name": "BaseBdev3", 00:16:48.789 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:48.789 "is_configured": true, 00:16:48.789 "data_offset": 0, 00:16:48.789 "data_size": 65536 00:16:48.789 }, 00:16:48.789 { 00:16:48.789 "name": "BaseBdev4", 00:16:48.789 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:48.789 "is_configured": true, 00:16:48.789 "data_offset": 0, 00:16:48.789 "data_size": 65536 00:16:48.789 } 00:16:48.789 ] 00:16:48.789 }' 00:16:48.789 11:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:16:48.789 11:53:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:49.358 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.359 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:49.618 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:16:49.618 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:49.878 [2024-05-14 11:53:16.730075] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:49.878 BaseBdev1 00:16:49.878 11:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:16:49.878 11:53:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:16:49.878 11:53:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:49.878 11:53:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:49.878 11:53:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:49.878 11:53:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:49.878 11:53:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:50.137 11:53:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 
2000 00:16:50.137 [ 00:16:50.137 { 00:16:50.137 "name": "BaseBdev1", 00:16:50.137 "aliases": [ 00:16:50.137 "d4f2d1d1-bec1-4302-8311-1c403f80b0bb" 00:16:50.137 ], 00:16:50.137 "product_name": "Malloc disk", 00:16:50.137 "block_size": 512, 00:16:50.137 "num_blocks": 65536, 00:16:50.137 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:50.137 "assigned_rate_limits": { 00:16:50.137 "rw_ios_per_sec": 0, 00:16:50.137 "rw_mbytes_per_sec": 0, 00:16:50.137 "r_mbytes_per_sec": 0, 00:16:50.137 "w_mbytes_per_sec": 0 00:16:50.137 }, 00:16:50.137 "claimed": true, 00:16:50.137 "claim_type": "exclusive_write", 00:16:50.137 "zoned": false, 00:16:50.137 "supported_io_types": { 00:16:50.137 "read": true, 00:16:50.137 "write": true, 00:16:50.137 "unmap": true, 00:16:50.137 "write_zeroes": true, 00:16:50.137 "flush": true, 00:16:50.137 "reset": true, 00:16:50.137 "compare": false, 00:16:50.137 "compare_and_write": false, 00:16:50.137 "abort": true, 00:16:50.137 "nvme_admin": false, 00:16:50.137 "nvme_io": false 00:16:50.137 }, 00:16:50.137 "memory_domains": [ 00:16:50.137 { 00:16:50.137 "dma_device_id": "system", 00:16:50.137 "dma_device_type": 1 00:16:50.137 }, 00:16:50.137 { 00:16:50.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.137 "dma_device_type": 2 00:16:50.137 } 00:16:50.137 ], 00:16:50.137 "driver_specific": {} 00:16:50.137 } 00:16:50.137 ] 00:16:50.137 11:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:50.137 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:50.137 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:50.137 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:50.137 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:50.137 11:53:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:50.137 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:50.398 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:50.398 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:50.398 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:50.398 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:50.398 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.398 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.398 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:50.398 "name": "Existed_Raid", 00:16:50.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:50.398 "strip_size_kb": 64, 00:16:50.398 "state": "configuring", 00:16:50.398 "raid_level": "concat", 00:16:50.398 "superblock": false, 00:16:50.398 "num_base_bdevs": 4, 00:16:50.398 "num_base_bdevs_discovered": 3, 00:16:50.398 "num_base_bdevs_operational": 4, 00:16:50.398 "base_bdevs_list": [ 00:16:50.398 { 00:16:50.398 "name": "BaseBdev1", 00:16:50.398 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:50.398 "is_configured": true, 00:16:50.398 "data_offset": 0, 00:16:50.398 "data_size": 65536 00:16:50.398 }, 00:16:50.398 { 00:16:50.398 "name": null, 00:16:50.398 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:50.398 "is_configured": false, 00:16:50.398 "data_offset": 0, 00:16:50.398 "data_size": 65536 00:16:50.398 }, 00:16:50.398 { 00:16:50.398 "name": "BaseBdev3", 00:16:50.398 "uuid": 
"b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:50.398 "is_configured": true, 00:16:50.398 "data_offset": 0, 00:16:50.398 "data_size": 65536 00:16:50.398 }, 00:16:50.398 { 00:16:50.398 "name": "BaseBdev4", 00:16:50.398 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:50.398 "is_configured": true, 00:16:50.398 "data_offset": 0, 00:16:50.398 "data_size": 65536 00:16:50.398 } 00:16:50.398 ] 00:16:50.398 }' 00:16:50.398 11:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:50.398 11:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.969 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.969 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:51.228 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:16:51.228 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:51.487 [2024-05-14 11:53:18.526879] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:51.487 
11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.487 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:51.746 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:51.746 "name": "Existed_Raid", 00:16:51.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:51.746 "strip_size_kb": 64, 00:16:51.746 "state": "configuring", 00:16:51.746 "raid_level": "concat", 00:16:51.746 "superblock": false, 00:16:51.746 "num_base_bdevs": 4, 00:16:51.746 "num_base_bdevs_discovered": 2, 00:16:51.746 "num_base_bdevs_operational": 4, 00:16:51.746 "base_bdevs_list": [ 00:16:51.746 { 00:16:51.746 "name": "BaseBdev1", 00:16:51.746 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:51.746 "is_configured": true, 00:16:51.746 "data_offset": 0, 00:16:51.746 "data_size": 65536 00:16:51.746 }, 00:16:51.746 { 00:16:51.746 "name": null, 00:16:51.746 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:51.746 "is_configured": false, 00:16:51.746 "data_offset": 0, 00:16:51.746 "data_size": 65536 00:16:51.746 }, 00:16:51.746 { 00:16:51.746 "name": null, 00:16:51.746 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:51.746 "is_configured": false, 00:16:51.746 "data_offset": 0, 
00:16:51.746 "data_size": 65536 00:16:51.746 }, 00:16:51.746 { 00:16:51.746 "name": "BaseBdev4", 00:16:51.746 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:51.746 "is_configured": true, 00:16:51.746 "data_offset": 0, 00:16:51.746 "data_size": 65536 00:16:51.746 } 00:16:51.746 ] 00:16:51.746 }' 00:16:51.746 11:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:51.746 11:53:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.682 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.682 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:52.682 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:16:52.682 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:52.941 [2024-05-14 11:53:19.878478] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:52.941 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=4 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.942 11:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:53.201 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:53.201 "name": "Existed_Raid", 00:16:53.201 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:53.201 "strip_size_kb": 64, 00:16:53.201 "state": "configuring", 00:16:53.201 "raid_level": "concat", 00:16:53.201 "superblock": false, 00:16:53.201 "num_base_bdevs": 4, 00:16:53.201 "num_base_bdevs_discovered": 3, 00:16:53.201 "num_base_bdevs_operational": 4, 00:16:53.201 "base_bdevs_list": [ 00:16:53.201 { 00:16:53.201 "name": "BaseBdev1", 00:16:53.201 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:53.201 "is_configured": true, 00:16:53.201 "data_offset": 0, 00:16:53.201 "data_size": 65536 00:16:53.201 }, 00:16:53.201 { 00:16:53.201 "name": null, 00:16:53.201 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:53.201 "is_configured": false, 00:16:53.201 "data_offset": 0, 00:16:53.201 "data_size": 65536 00:16:53.201 }, 00:16:53.201 { 00:16:53.201 "name": "BaseBdev3", 00:16:53.201 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:53.201 "is_configured": true, 00:16:53.201 "data_offset": 0, 00:16:53.201 "data_size": 65536 00:16:53.201 }, 00:16:53.201 { 00:16:53.201 
"name": "BaseBdev4", 00:16:53.201 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:53.201 "is_configured": true, 00:16:53.201 "data_offset": 0, 00:16:53.201 "data_size": 65536 00:16:53.201 } 00:16:53.201 ] 00:16:53.201 }' 00:16:53.201 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:53.201 11:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:53.769 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.769 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:54.028 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:16:54.028 11:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:54.287 [2024-05-14 11:53:21.222033] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local raid_bdev_info 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.287 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:54.546 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:54.546 "name": "Existed_Raid", 00:16:54.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.546 "strip_size_kb": 64, 00:16:54.546 "state": "configuring", 00:16:54.546 "raid_level": "concat", 00:16:54.546 "superblock": false, 00:16:54.546 "num_base_bdevs": 4, 00:16:54.546 "num_base_bdevs_discovered": 2, 00:16:54.546 "num_base_bdevs_operational": 4, 00:16:54.546 "base_bdevs_list": [ 00:16:54.546 { 00:16:54.546 "name": null, 00:16:54.546 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:54.546 "is_configured": false, 00:16:54.546 "data_offset": 0, 00:16:54.546 "data_size": 65536 00:16:54.546 }, 00:16:54.546 { 00:16:54.546 "name": null, 00:16:54.546 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:54.546 "is_configured": false, 00:16:54.546 "data_offset": 0, 00:16:54.546 "data_size": 65536 00:16:54.546 }, 00:16:54.546 { 00:16:54.546 "name": "BaseBdev3", 00:16:54.546 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:54.546 "is_configured": true, 00:16:54.546 "data_offset": 0, 00:16:54.546 "data_size": 65536 00:16:54.546 }, 00:16:54.546 { 00:16:54.546 "name": "BaseBdev4", 00:16:54.546 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:54.546 "is_configured": true, 
00:16:54.546 "data_offset": 0, 00:16:54.546 "data_size": 65536 00:16:54.546 } 00:16:54.546 ] 00:16:54.546 }' 00:16:54.546 11:53:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:54.546 11:53:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:55.114 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.114 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:55.374 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:16:55.374 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:55.633 [2024-05-14 11:53:22.593098] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:55.633 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:55.633 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:55.633 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:55.633 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:55.634 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:55.634 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:55.634 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:55.634 11:53:22 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:55.634 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:55.634 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:55.634 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.634 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:55.892 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:55.892 "name": "Existed_Raid", 00:16:55.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.892 "strip_size_kb": 64, 00:16:55.892 "state": "configuring", 00:16:55.892 "raid_level": "concat", 00:16:55.892 "superblock": false, 00:16:55.892 "num_base_bdevs": 4, 00:16:55.892 "num_base_bdevs_discovered": 3, 00:16:55.892 "num_base_bdevs_operational": 4, 00:16:55.892 "base_bdevs_list": [ 00:16:55.892 { 00:16:55.892 "name": null, 00:16:55.892 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:55.892 "is_configured": false, 00:16:55.892 "data_offset": 0, 00:16:55.892 "data_size": 65536 00:16:55.892 }, 00:16:55.892 { 00:16:55.892 "name": "BaseBdev2", 00:16:55.892 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:55.892 "is_configured": true, 00:16:55.892 "data_offset": 0, 00:16:55.892 "data_size": 65536 00:16:55.892 }, 00:16:55.892 { 00:16:55.892 "name": "BaseBdev3", 00:16:55.892 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:55.892 "is_configured": true, 00:16:55.892 "data_offset": 0, 00:16:55.892 "data_size": 65536 00:16:55.892 }, 00:16:55.892 { 00:16:55.892 "name": "BaseBdev4", 00:16:55.892 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:55.892 "is_configured": true, 00:16:55.892 "data_offset": 0, 00:16:55.892 "data_size": 65536 00:16:55.892 } 
00:16:55.892 ] 00:16:55.892 }' 00:16:55.892 11:53:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:55.892 11:53:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:56.460 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.460 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:56.719 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:16:56.719 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.719 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:56.977 11:53:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d4f2d1d1-bec1-4302-8311-1c403f80b0bb 00:16:57.236 [2024-05-14 11:53:24.199508] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:57.236 [2024-05-14 11:53:24.199546] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xd71060 00:16:57.236 [2024-05-14 11:53:24.199554] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:57.236 [2024-05-14 11:53:24.199749] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd70aa0 00:16:57.236 [2024-05-14 11:53:24.199876] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd71060 00:16:57.236 [2024-05-14 11:53:24.199886] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0xd71060 00:16:57.236 [2024-05-14 11:53:24.200060] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:57.236 NewBaseBdev 00:16:57.236 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:16:57.236 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:16:57.236 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:57.236 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:57.236 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:57.236 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:57.236 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:57.494 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:57.753 [ 00:16:57.753 { 00:16:57.753 "name": "NewBaseBdev", 00:16:57.753 "aliases": [ 00:16:57.753 "d4f2d1d1-bec1-4302-8311-1c403f80b0bb" 00:16:57.753 ], 00:16:57.753 "product_name": "Malloc disk", 00:16:57.753 "block_size": 512, 00:16:57.753 "num_blocks": 65536, 00:16:57.753 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:57.753 "assigned_rate_limits": { 00:16:57.753 "rw_ios_per_sec": 0, 00:16:57.753 "rw_mbytes_per_sec": 0, 00:16:57.753 "r_mbytes_per_sec": 0, 00:16:57.753 "w_mbytes_per_sec": 0 00:16:57.753 }, 00:16:57.753 "claimed": true, 00:16:57.753 "claim_type": "exclusive_write", 00:16:57.753 "zoned": false, 00:16:57.753 "supported_io_types": { 00:16:57.753 "read": true, 00:16:57.753 "write": true, 
00:16:57.753 "unmap": true, 00:16:57.753 "write_zeroes": true, 00:16:57.753 "flush": true, 00:16:57.753 "reset": true, 00:16:57.753 "compare": false, 00:16:57.753 "compare_and_write": false, 00:16:57.753 "abort": true, 00:16:57.753 "nvme_admin": false, 00:16:57.753 "nvme_io": false 00:16:57.753 }, 00:16:57.753 "memory_domains": [ 00:16:57.753 { 00:16:57.753 "dma_device_id": "system", 00:16:57.754 "dma_device_type": 1 00:16:57.754 }, 00:16:57.754 { 00:16:57.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:57.754 "dma_device_type": 2 00:16:57.754 } 00:16:57.754 ], 00:16:57.754 "driver_specific": {} 00:16:57.754 } 00:16:57.754 ] 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:16:57.754 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.013 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:58.013 "name": "Existed_Raid", 00:16:58.013 "uuid": "f91aec80-71e5-41ef-b4c8-e3fd97503b7e", 00:16:58.013 "strip_size_kb": 64, 00:16:58.013 "state": "online", 00:16:58.013 "raid_level": "concat", 00:16:58.013 "superblock": false, 00:16:58.013 "num_base_bdevs": 4, 00:16:58.013 "num_base_bdevs_discovered": 4, 00:16:58.013 "num_base_bdevs_operational": 4, 00:16:58.013 "base_bdevs_list": [ 00:16:58.013 { 00:16:58.013 "name": "NewBaseBdev", 00:16:58.013 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:58.013 "is_configured": true, 00:16:58.013 "data_offset": 0, 00:16:58.013 "data_size": 65536 00:16:58.013 }, 00:16:58.013 { 00:16:58.013 "name": "BaseBdev2", 00:16:58.013 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:58.013 "is_configured": true, 00:16:58.013 "data_offset": 0, 00:16:58.013 "data_size": 65536 00:16:58.013 }, 00:16:58.013 { 00:16:58.013 "name": "BaseBdev3", 00:16:58.013 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:58.013 "is_configured": true, 00:16:58.013 "data_offset": 0, 00:16:58.013 "data_size": 65536 00:16:58.013 }, 00:16:58.013 { 00:16:58.013 "name": "BaseBdev4", 00:16:58.013 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:58.013 "is_configured": true, 00:16:58.013 "data_offset": 0, 00:16:58.013 "data_size": 65536 00:16:58.013 } 00:16:58.013 ] 00:16:58.013 }' 00:16:58.013 11:53:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:58.013 11:53:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.583 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 
00:16:58.583 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:16:58.583 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:58.583 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:58.583 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:58.583 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:58.583 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:58.583 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:58.893 [2024-05-14 11:53:25.759946] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:58.893 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:58.893 "name": "Existed_Raid", 00:16:58.893 "aliases": [ 00:16:58.893 "f91aec80-71e5-41ef-b4c8-e3fd97503b7e" 00:16:58.893 ], 00:16:58.893 "product_name": "Raid Volume", 00:16:58.893 "block_size": 512, 00:16:58.893 "num_blocks": 262144, 00:16:58.893 "uuid": "f91aec80-71e5-41ef-b4c8-e3fd97503b7e", 00:16:58.893 "assigned_rate_limits": { 00:16:58.893 "rw_ios_per_sec": 0, 00:16:58.893 "rw_mbytes_per_sec": 0, 00:16:58.893 "r_mbytes_per_sec": 0, 00:16:58.893 "w_mbytes_per_sec": 0 00:16:58.893 }, 00:16:58.893 "claimed": false, 00:16:58.893 "zoned": false, 00:16:58.893 "supported_io_types": { 00:16:58.893 "read": true, 00:16:58.893 "write": true, 00:16:58.893 "unmap": true, 00:16:58.893 "write_zeroes": true, 00:16:58.893 "flush": true, 00:16:58.893 "reset": true, 00:16:58.893 "compare": false, 00:16:58.893 "compare_and_write": false, 00:16:58.893 "abort": false, 00:16:58.893 "nvme_admin": false, 00:16:58.893 
"nvme_io": false 00:16:58.893 }, 00:16:58.893 "memory_domains": [ 00:16:58.893 { 00:16:58.893 "dma_device_id": "system", 00:16:58.893 "dma_device_type": 1 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.893 "dma_device_type": 2 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "dma_device_id": "system", 00:16:58.893 "dma_device_type": 1 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.893 "dma_device_type": 2 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "dma_device_id": "system", 00:16:58.893 "dma_device_type": 1 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.893 "dma_device_type": 2 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "dma_device_id": "system", 00:16:58.893 "dma_device_type": 1 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.893 "dma_device_type": 2 00:16:58.893 } 00:16:58.893 ], 00:16:58.893 "driver_specific": { 00:16:58.893 "raid": { 00:16:58.893 "uuid": "f91aec80-71e5-41ef-b4c8-e3fd97503b7e", 00:16:58.893 "strip_size_kb": 64, 00:16:58.893 "state": "online", 00:16:58.893 "raid_level": "concat", 00:16:58.893 "superblock": false, 00:16:58.893 "num_base_bdevs": 4, 00:16:58.893 "num_base_bdevs_discovered": 4, 00:16:58.893 "num_base_bdevs_operational": 4, 00:16:58.893 "base_bdevs_list": [ 00:16:58.893 { 00:16:58.893 "name": "NewBaseBdev", 00:16:58.893 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:58.893 "is_configured": true, 00:16:58.893 "data_offset": 0, 00:16:58.893 "data_size": 65536 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "name": "BaseBdev2", 00:16:58.893 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:58.893 "is_configured": true, 00:16:58.893 "data_offset": 0, 00:16:58.893 "data_size": 65536 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "name": "BaseBdev3", 00:16:58.893 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:16:58.893 "is_configured": true, 
00:16:58.893 "data_offset": 0, 00:16:58.893 "data_size": 65536 00:16:58.893 }, 00:16:58.893 { 00:16:58.893 "name": "BaseBdev4", 00:16:58.893 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:16:58.893 "is_configured": true, 00:16:58.893 "data_offset": 0, 00:16:58.893 "data_size": 65536 00:16:58.893 } 00:16:58.893 ] 00:16:58.893 } 00:16:58.893 } 00:16:58.893 }' 00:16:58.894 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:58.894 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:16:58.894 BaseBdev2 00:16:58.894 BaseBdev3 00:16:58.894 BaseBdev4' 00:16:58.894 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:58.894 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:58.894 11:53:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:59.156 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:59.156 "name": "NewBaseBdev", 00:16:59.156 "aliases": [ 00:16:59.156 "d4f2d1d1-bec1-4302-8311-1c403f80b0bb" 00:16:59.156 ], 00:16:59.156 "product_name": "Malloc disk", 00:16:59.156 "block_size": 512, 00:16:59.156 "num_blocks": 65536, 00:16:59.156 "uuid": "d4f2d1d1-bec1-4302-8311-1c403f80b0bb", 00:16:59.157 "assigned_rate_limits": { 00:16:59.157 "rw_ios_per_sec": 0, 00:16:59.157 "rw_mbytes_per_sec": 0, 00:16:59.157 "r_mbytes_per_sec": 0, 00:16:59.157 "w_mbytes_per_sec": 0 00:16:59.157 }, 00:16:59.157 "claimed": true, 00:16:59.157 "claim_type": "exclusive_write", 00:16:59.157 "zoned": false, 00:16:59.157 "supported_io_types": { 00:16:59.157 "read": true, 00:16:59.157 "write": true, 00:16:59.157 "unmap": true, 00:16:59.157 "write_zeroes": 
true, 00:16:59.157 "flush": true, 00:16:59.157 "reset": true, 00:16:59.157 "compare": false, 00:16:59.157 "compare_and_write": false, 00:16:59.157 "abort": true, 00:16:59.157 "nvme_admin": false, 00:16:59.157 "nvme_io": false 00:16:59.157 }, 00:16:59.157 "memory_domains": [ 00:16:59.157 { 00:16:59.157 "dma_device_id": "system", 00:16:59.157 "dma_device_type": 1 00:16:59.157 }, 00:16:59.157 { 00:16:59.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.157 "dma_device_type": 2 00:16:59.157 } 00:16:59.157 ], 00:16:59.157 "driver_specific": {} 00:16:59.157 }' 00:16:59.157 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:59.157 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:59.157 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:59.157 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:59.157 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:59.416 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:59.675 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:59.675 "name": "BaseBdev2", 00:16:59.675 "aliases": [ 00:16:59.675 "c4795920-6b92-44b1-adc8-9ccaa04b7121" 00:16:59.675 ], 00:16:59.675 "product_name": "Malloc disk", 00:16:59.675 "block_size": 512, 00:16:59.675 "num_blocks": 65536, 00:16:59.675 "uuid": "c4795920-6b92-44b1-adc8-9ccaa04b7121", 00:16:59.675 "assigned_rate_limits": { 00:16:59.675 "rw_ios_per_sec": 0, 00:16:59.675 "rw_mbytes_per_sec": 0, 00:16:59.675 "r_mbytes_per_sec": 0, 00:16:59.675 "w_mbytes_per_sec": 0 00:16:59.675 }, 00:16:59.675 "claimed": true, 00:16:59.675 "claim_type": "exclusive_write", 00:16:59.675 "zoned": false, 00:16:59.675 "supported_io_types": { 00:16:59.675 "read": true, 00:16:59.675 "write": true, 00:16:59.675 "unmap": true, 00:16:59.675 "write_zeroes": true, 00:16:59.675 "flush": true, 00:16:59.675 "reset": true, 00:16:59.675 "compare": false, 00:16:59.675 "compare_and_write": false, 00:16:59.675 "abort": true, 00:16:59.675 "nvme_admin": false, 00:16:59.675 "nvme_io": false 00:16:59.675 }, 00:16:59.675 "memory_domains": [ 00:16:59.675 { 00:16:59.676 "dma_device_id": "system", 00:16:59.676 "dma_device_type": 1 00:16:59.676 }, 00:16:59.676 { 00:16:59.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.676 "dma_device_type": 2 00:16:59.676 } 00:16:59.676 ], 00:16:59.676 "driver_specific": {} 00:16:59.676 }' 00:16:59.676 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:59.676 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:59.676 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:59.935 11:53:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:59.935 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:59.935 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.935 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:59.935 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:59.935 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:59.935 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:59.935 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:00.194 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:00.194 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:00.194 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:00.194 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:00.194 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:00.194 "name": "BaseBdev3", 00:17:00.194 "aliases": [ 00:17:00.194 "b6ecef25-8b81-4637-9b6c-1b95a91c1cce" 00:17:00.194 ], 00:17:00.194 "product_name": "Malloc disk", 00:17:00.195 "block_size": 512, 00:17:00.195 "num_blocks": 65536, 00:17:00.195 "uuid": "b6ecef25-8b81-4637-9b6c-1b95a91c1cce", 00:17:00.195 "assigned_rate_limits": { 00:17:00.195 "rw_ios_per_sec": 0, 00:17:00.195 "rw_mbytes_per_sec": 0, 00:17:00.195 "r_mbytes_per_sec": 0, 00:17:00.195 "w_mbytes_per_sec": 0 00:17:00.195 }, 00:17:00.195 "claimed": true, 00:17:00.195 "claim_type": "exclusive_write", 
00:17:00.195 "zoned": false, 00:17:00.195 "supported_io_types": { 00:17:00.195 "read": true, 00:17:00.195 "write": true, 00:17:00.195 "unmap": true, 00:17:00.195 "write_zeroes": true, 00:17:00.195 "flush": true, 00:17:00.195 "reset": true, 00:17:00.195 "compare": false, 00:17:00.195 "compare_and_write": false, 00:17:00.195 "abort": true, 00:17:00.195 "nvme_admin": false, 00:17:00.195 "nvme_io": false 00:17:00.195 }, 00:17:00.195 "memory_domains": [ 00:17:00.195 { 00:17:00.195 "dma_device_id": "system", 00:17:00.195 "dma_device_type": 1 00:17:00.195 }, 00:17:00.195 { 00:17:00.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.195 "dma_device_type": 2 00:17:00.195 } 00:17:00.195 ], 00:17:00.195 "driver_specific": {} 00:17:00.195 }' 00:17:00.195 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:00.454 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:00.454 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:00.454 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:00.454 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:00.454 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.454 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:00.454 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:00.454 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.454 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:00.712 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:00.712 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:00.712 11:53:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:00.712 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:00.712 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:00.971 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:00.971 "name": "BaseBdev4", 00:17:00.971 "aliases": [ 00:17:00.971 "2dcaf410-10f5-41f7-b946-ab70cabd6dab" 00:17:00.971 ], 00:17:00.971 "product_name": "Malloc disk", 00:17:00.971 "block_size": 512, 00:17:00.971 "num_blocks": 65536, 00:17:00.971 "uuid": "2dcaf410-10f5-41f7-b946-ab70cabd6dab", 00:17:00.971 "assigned_rate_limits": { 00:17:00.971 "rw_ios_per_sec": 0, 00:17:00.971 "rw_mbytes_per_sec": 0, 00:17:00.971 "r_mbytes_per_sec": 0, 00:17:00.971 "w_mbytes_per_sec": 0 00:17:00.971 }, 00:17:00.971 "claimed": true, 00:17:00.971 "claim_type": "exclusive_write", 00:17:00.971 "zoned": false, 00:17:00.971 "supported_io_types": { 00:17:00.971 "read": true, 00:17:00.971 "write": true, 00:17:00.971 "unmap": true, 00:17:00.971 "write_zeroes": true, 00:17:00.971 "flush": true, 00:17:00.971 "reset": true, 00:17:00.971 "compare": false, 00:17:00.971 "compare_and_write": false, 00:17:00.971 "abort": true, 00:17:00.971 "nvme_admin": false, 00:17:00.971 "nvme_io": false 00:17:00.971 }, 00:17:00.971 "memory_domains": [ 00:17:00.971 { 00:17:00.971 "dma_device_id": "system", 00:17:00.971 "dma_device_type": 1 00:17:00.971 }, 00:17:00.971 { 00:17:00.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.971 "dma_device_type": 2 00:17:00.971 } 00:17:00.971 ], 00:17:00.971 "driver_specific": {} 00:17:00.971 }' 00:17:00.971 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:00.971 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 
-- # jq .block_size 00:17:00.971 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:00.971 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:00.971 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:00.971 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.971 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.230 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.230 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.230 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.230 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.230 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:01.230 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:01.490 [2024-05-14 11:53:28.426755] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:01.490 [2024-05-14 11:53:28.426779] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:01.490 [2024-05-14 11:53:28.426829] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:01.490 [2024-05-14 11:53:28.426888] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:01.490 [2024-05-14 11:53:28.426900] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd71060 name Existed_Raid, state offline 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@342 -- # killprocess 1724342 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 1724342 ']' 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 1724342 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1724342 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1724342' 00:17:01.490 killing process with pid 1724342 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 1724342 00:17:01.490 [2024-05-14 11:53:28.506167] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:01.490 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 1724342 00:17:01.750 [2024-05-14 11:53:28.581359] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:02.009 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:17:02.009 00:17:02.009 real 0m32.516s 00:17:02.009 user 0m59.420s 00:17:02.009 sys 0m5.743s 00:17:02.009 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:17:02.009 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.009 ************************************ 00:17:02.009 END TEST raid_state_function_test 00:17:02.009 
************************************ 00:17:02.009 11:53:29 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:17:02.009 11:53:29 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:17:02.009 11:53:29 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:17:02.009 11:53:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:02.009 ************************************ 00:17:02.009 START TEST raid_state_function_test_sb 00:17:02.009 ************************************ 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 4 true 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i 
<= num_base_bdevs )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:17:02.009 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:17:02.010 11:53:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=1729235 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1729235' 00:17:02.010 Process raid pid: 1729235 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 1729235 /var/tmp/spdk-raid.sock 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1729235 ']' 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:02.010 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:17:02.010 11:53:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:02.269 [2024-05-14 11:53:29.145928] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:17:02.269 [2024-05-14 11:53:29.145997] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:02.269 [2024-05-14 11:53:29.276517] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:02.528 [2024-05-14 11:53:29.381908] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:17:02.528 [2024-05-14 11:53:29.448834] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:02.528 [2024-05-14 11:53:29.448863] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:03.095 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:17:03.095 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:17:03.095 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:03.354 [2024-05-14 11:53:30.299874] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:03.354 [2024-05-14 11:53:30.299922] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:03.354 [2024-05-14 11:53:30.299933] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:03.354 [2024-05-14 11:53:30.299945] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:03.354 [2024-05-14 11:53:30.299954] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:03.354 [2024-05-14 11:53:30.299966] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:03.354 
[2024-05-14 11:53:30.299975] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:03.354 [2024-05-14 11:53:30.299986] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.354 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:03.613 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:03.613 "name": "Existed_Raid", 00:17:03.613 "uuid": "a3cd7acc-d87a-4e80-a543-09269dbfe282", 00:17:03.613 
"strip_size_kb": 64, 00:17:03.613 "state": "configuring", 00:17:03.613 "raid_level": "concat", 00:17:03.613 "superblock": true, 00:17:03.613 "num_base_bdevs": 4, 00:17:03.613 "num_base_bdevs_discovered": 0, 00:17:03.613 "num_base_bdevs_operational": 4, 00:17:03.613 "base_bdevs_list": [ 00:17:03.613 { 00:17:03.613 "name": "BaseBdev1", 00:17:03.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.613 "is_configured": false, 00:17:03.613 "data_offset": 0, 00:17:03.613 "data_size": 0 00:17:03.613 }, 00:17:03.613 { 00:17:03.613 "name": "BaseBdev2", 00:17:03.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.613 "is_configured": false, 00:17:03.613 "data_offset": 0, 00:17:03.613 "data_size": 0 00:17:03.613 }, 00:17:03.613 { 00:17:03.613 "name": "BaseBdev3", 00:17:03.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.613 "is_configured": false, 00:17:03.613 "data_offset": 0, 00:17:03.613 "data_size": 0 00:17:03.613 }, 00:17:03.613 { 00:17:03.613 "name": "BaseBdev4", 00:17:03.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.613 "is_configured": false, 00:17:03.613 "data_offset": 0, 00:17:03.613 "data_size": 0 00:17:03.613 } 00:17:03.613 ] 00:17:03.613 }' 00:17:03.613 11:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:03.613 11:53:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:04.181 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:04.439 [2024-05-14 11:53:31.378568] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:04.439 [2024-05-14 11:53:31.378602] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2593720 name Existed_Raid, state configuring 00:17:04.439 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:04.698 [2024-05-14 11:53:31.623244] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:04.698 [2024-05-14 11:53:31.623273] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:04.698 [2024-05-14 11:53:31.623283] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:04.698 [2024-05-14 11:53:31.623296] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:04.698 [2024-05-14 11:53:31.623305] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:04.698 [2024-05-14 11:53:31.623316] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:04.698 [2024-05-14 11:53:31.623325] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:04.698 [2024-05-14 11:53:31.623336] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:04.698 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:04.957 [2024-05-14 11:53:31.869665] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:04.957 BaseBdev1 00:17:04.957 11:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:17:04.957 11:53:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:17:04.957 11:53:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:04.957 11:53:31 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:04.957 11:53:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:04.957 11:53:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:04.957 11:53:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:05.215 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:05.473 [ 00:17:05.473 { 00:17:05.473 "name": "BaseBdev1", 00:17:05.473 "aliases": [ 00:17:05.473 "c6204f20-1a3e-4506-8ac0-84914f41d005" 00:17:05.473 ], 00:17:05.473 "product_name": "Malloc disk", 00:17:05.473 "block_size": 512, 00:17:05.473 "num_blocks": 65536, 00:17:05.473 "uuid": "c6204f20-1a3e-4506-8ac0-84914f41d005", 00:17:05.473 "assigned_rate_limits": { 00:17:05.473 "rw_ios_per_sec": 0, 00:17:05.473 "rw_mbytes_per_sec": 0, 00:17:05.473 "r_mbytes_per_sec": 0, 00:17:05.473 "w_mbytes_per_sec": 0 00:17:05.473 }, 00:17:05.473 "claimed": true, 00:17:05.473 "claim_type": "exclusive_write", 00:17:05.473 "zoned": false, 00:17:05.473 "supported_io_types": { 00:17:05.473 "read": true, 00:17:05.473 "write": true, 00:17:05.473 "unmap": true, 00:17:05.473 "write_zeroes": true, 00:17:05.473 "flush": true, 00:17:05.473 "reset": true, 00:17:05.473 "compare": false, 00:17:05.473 "compare_and_write": false, 00:17:05.473 "abort": true, 00:17:05.473 "nvme_admin": false, 00:17:05.473 "nvme_io": false 00:17:05.473 }, 00:17:05.473 "memory_domains": [ 00:17:05.473 { 00:17:05.473 "dma_device_id": "system", 00:17:05.473 "dma_device_type": 1 00:17:05.473 }, 00:17:05.473 { 00:17:05.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.473 
"dma_device_type": 2 00:17:05.473 } 00:17:05.473 ], 00:17:05.473 "driver_specific": {} 00:17:05.473 } 00:17:05.473 ] 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.473 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:05.731 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:05.731 "name": "Existed_Raid", 00:17:05.731 "uuid": "113ef2c3-bb10-42ac-8d98-ca77cb12e909", 00:17:05.731 "strip_size_kb": 64, 
00:17:05.731 "state": "configuring", 00:17:05.731 "raid_level": "concat", 00:17:05.731 "superblock": true, 00:17:05.731 "num_base_bdevs": 4, 00:17:05.731 "num_base_bdevs_discovered": 1, 00:17:05.731 "num_base_bdevs_operational": 4, 00:17:05.731 "base_bdevs_list": [ 00:17:05.731 { 00:17:05.731 "name": "BaseBdev1", 00:17:05.731 "uuid": "c6204f20-1a3e-4506-8ac0-84914f41d005", 00:17:05.731 "is_configured": true, 00:17:05.731 "data_offset": 2048, 00:17:05.731 "data_size": 63488 00:17:05.731 }, 00:17:05.731 { 00:17:05.731 "name": "BaseBdev2", 00:17:05.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.731 "is_configured": false, 00:17:05.731 "data_offset": 0, 00:17:05.731 "data_size": 0 00:17:05.731 }, 00:17:05.731 { 00:17:05.731 "name": "BaseBdev3", 00:17:05.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.731 "is_configured": false, 00:17:05.731 "data_offset": 0, 00:17:05.731 "data_size": 0 00:17:05.731 }, 00:17:05.731 { 00:17:05.731 "name": "BaseBdev4", 00:17:05.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.731 "is_configured": false, 00:17:05.731 "data_offset": 0, 00:17:05.731 "data_size": 0 00:17:05.731 } 00:17:05.731 ] 00:17:05.731 }' 00:17:05.731 11:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:05.731 11:53:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:06.298 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:06.557 [2024-05-14 11:53:33.445824] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:06.557 [2024-05-14 11:53:33.445865] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2592fb0 name Existed_Raid, state configuring 00:17:06.557 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:06.815 [2024-05-14 11:53:33.690523] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:06.815 [2024-05-14 11:53:33.692041] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:06.815 [2024-05-14 11:53:33.692072] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:06.815 [2024-05-14 11:53:33.692082] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:06.815 [2024-05-14 11:53:33.692095] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:06.815 [2024-05-14 11:53:33.692104] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:06.815 [2024-05-14 11:53:33.692115] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:06.815 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.816 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:07.075 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:07.075 "name": "Existed_Raid", 00:17:07.075 "uuid": "b2ea393a-02fc-48eb-8e83-6aad6f9618e7", 00:17:07.075 "strip_size_kb": 64, 00:17:07.075 "state": "configuring", 00:17:07.075 "raid_level": "concat", 00:17:07.075 "superblock": true, 00:17:07.075 "num_base_bdevs": 4, 00:17:07.075 "num_base_bdevs_discovered": 1, 00:17:07.075 "num_base_bdevs_operational": 4, 00:17:07.075 "base_bdevs_list": [ 00:17:07.075 { 00:17:07.075 "name": "BaseBdev1", 00:17:07.075 "uuid": "c6204f20-1a3e-4506-8ac0-84914f41d005", 00:17:07.075 "is_configured": true, 00:17:07.075 "data_offset": 2048, 00:17:07.075 "data_size": 63488 00:17:07.075 }, 00:17:07.075 { 00:17:07.075 "name": "BaseBdev2", 00:17:07.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.075 "is_configured": false, 00:17:07.075 "data_offset": 0, 00:17:07.075 "data_size": 0 00:17:07.075 }, 00:17:07.075 { 00:17:07.075 "name": "BaseBdev3", 00:17:07.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.075 "is_configured": false, 00:17:07.075 "data_offset": 0, 00:17:07.075 
"data_size": 0 00:17:07.075 }, 00:17:07.075 { 00:17:07.075 "name": "BaseBdev4", 00:17:07.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.075 "is_configured": false, 00:17:07.075 "data_offset": 0, 00:17:07.075 "data_size": 0 00:17:07.075 } 00:17:07.075 ] 00:17:07.075 }' 00:17:07.075 11:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:07.075 11:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:07.642 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:07.642 [2024-05-14 11:53:34.708721] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:07.642 BaseBdev2 00:17:07.900 11:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:17:07.900 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:17:07.900 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:07.900 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:07.900 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:07.900 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:07.900 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:07.900 11:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:08.158 [ 
00:17:08.158 { 00:17:08.158 "name": "BaseBdev2", 00:17:08.158 "aliases": [ 00:17:08.158 "9cb75d0d-eed0-4aae-a468-b57e7543c90d" 00:17:08.158 ], 00:17:08.158 "product_name": "Malloc disk", 00:17:08.158 "block_size": 512, 00:17:08.158 "num_blocks": 65536, 00:17:08.158 "uuid": "9cb75d0d-eed0-4aae-a468-b57e7543c90d", 00:17:08.158 "assigned_rate_limits": { 00:17:08.158 "rw_ios_per_sec": 0, 00:17:08.158 "rw_mbytes_per_sec": 0, 00:17:08.158 "r_mbytes_per_sec": 0, 00:17:08.158 "w_mbytes_per_sec": 0 00:17:08.158 }, 00:17:08.158 "claimed": true, 00:17:08.158 "claim_type": "exclusive_write", 00:17:08.158 "zoned": false, 00:17:08.158 "supported_io_types": { 00:17:08.158 "read": true, 00:17:08.158 "write": true, 00:17:08.158 "unmap": true, 00:17:08.158 "write_zeroes": true, 00:17:08.158 "flush": true, 00:17:08.158 "reset": true, 00:17:08.158 "compare": false, 00:17:08.158 "compare_and_write": false, 00:17:08.158 "abort": true, 00:17:08.158 "nvme_admin": false, 00:17:08.158 "nvme_io": false 00:17:08.158 }, 00:17:08.158 "memory_domains": [ 00:17:08.158 { 00:17:08.158 "dma_device_id": "system", 00:17:08.158 "dma_device_type": 1 00:17:08.158 }, 00:17:08.158 { 00:17:08.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.158 "dma_device_type": 2 00:17:08.158 } 00:17:08.158 ], 00:17:08.158 "driver_specific": {} 00:17:08.158 } 00:17:08.158 ] 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:08.158 11:53:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.158 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.416 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:08.416 "name": "Existed_Raid", 00:17:08.416 "uuid": "b2ea393a-02fc-48eb-8e83-6aad6f9618e7", 00:17:08.416 "strip_size_kb": 64, 00:17:08.416 "state": "configuring", 00:17:08.416 "raid_level": "concat", 00:17:08.416 "superblock": true, 00:17:08.416 "num_base_bdevs": 4, 00:17:08.416 "num_base_bdevs_discovered": 2, 00:17:08.416 "num_base_bdevs_operational": 4, 00:17:08.416 "base_bdevs_list": [ 00:17:08.416 { 00:17:08.416 "name": "BaseBdev1", 00:17:08.416 "uuid": "c6204f20-1a3e-4506-8ac0-84914f41d005", 00:17:08.416 "is_configured": true, 00:17:08.416 "data_offset": 2048, 00:17:08.416 "data_size": 63488 00:17:08.416 }, 00:17:08.416 { 00:17:08.416 
"name": "BaseBdev2", 00:17:08.416 "uuid": "9cb75d0d-eed0-4aae-a468-b57e7543c90d", 00:17:08.416 "is_configured": true, 00:17:08.416 "data_offset": 2048, 00:17:08.416 "data_size": 63488 00:17:08.416 }, 00:17:08.416 { 00:17:08.416 "name": "BaseBdev3", 00:17:08.416 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.416 "is_configured": false, 00:17:08.416 "data_offset": 0, 00:17:08.416 "data_size": 0 00:17:08.416 }, 00:17:08.416 { 00:17:08.416 "name": "BaseBdev4", 00:17:08.416 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.416 "is_configured": false, 00:17:08.416 "data_offset": 0, 00:17:08.416 "data_size": 0 00:17:08.416 } 00:17:08.416 ] 00:17:08.416 }' 00:17:08.416 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:08.416 11:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:08.980 11:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:08.980 [2024-05-14 11:53:36.060093] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:08.980 BaseBdev3 00:17:09.239 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:17:09.239 11:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:17:09.239 11:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:09.239 11:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:09.239 11:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:09.239 11:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:09.239 11:53:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:09.497 [ 00:17:09.497 { 00:17:09.497 "name": "BaseBdev3", 00:17:09.497 "aliases": [ 00:17:09.497 "e9b34022-2b5c-4919-b3f9-efd72fb392d1" 00:17:09.497 ], 00:17:09.497 "product_name": "Malloc disk", 00:17:09.497 "block_size": 512, 00:17:09.497 "num_blocks": 65536, 00:17:09.497 "uuid": "e9b34022-2b5c-4919-b3f9-efd72fb392d1", 00:17:09.497 "assigned_rate_limits": { 00:17:09.497 "rw_ios_per_sec": 0, 00:17:09.497 "rw_mbytes_per_sec": 0, 00:17:09.497 "r_mbytes_per_sec": 0, 00:17:09.497 "w_mbytes_per_sec": 0 00:17:09.497 }, 00:17:09.497 "claimed": true, 00:17:09.497 "claim_type": "exclusive_write", 00:17:09.497 "zoned": false, 00:17:09.497 "supported_io_types": { 00:17:09.497 "read": true, 00:17:09.497 "write": true, 00:17:09.497 "unmap": true, 00:17:09.497 "write_zeroes": true, 00:17:09.497 "flush": true, 00:17:09.497 "reset": true, 00:17:09.497 "compare": false, 00:17:09.497 "compare_and_write": false, 00:17:09.497 "abort": true, 00:17:09.497 "nvme_admin": false, 00:17:09.497 "nvme_io": false 00:17:09.497 }, 00:17:09.497 "memory_domains": [ 00:17:09.497 { 00:17:09.497 "dma_device_id": "system", 00:17:09.497 "dma_device_type": 1 00:17:09.497 }, 00:17:09.497 { 00:17:09.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.497 "dma_device_type": 2 00:17:09.497 } 00:17:09.497 ], 00:17:09.497 "driver_specific": {} 00:17:09.497 } 00:17:09.497 ] 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.497 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:09.756 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:09.756 "name": "Existed_Raid", 00:17:09.756 "uuid": "b2ea393a-02fc-48eb-8e83-6aad6f9618e7", 00:17:09.756 "strip_size_kb": 64, 00:17:09.756 "state": "configuring", 00:17:09.756 "raid_level": "concat", 00:17:09.756 "superblock": true, 00:17:09.756 "num_base_bdevs": 4, 00:17:09.756 
"num_base_bdevs_discovered": 3, 00:17:09.756 "num_base_bdevs_operational": 4, 00:17:09.756 "base_bdevs_list": [ 00:17:09.756 { 00:17:09.756 "name": "BaseBdev1", 00:17:09.756 "uuid": "c6204f20-1a3e-4506-8ac0-84914f41d005", 00:17:09.756 "is_configured": true, 00:17:09.756 "data_offset": 2048, 00:17:09.756 "data_size": 63488 00:17:09.756 }, 00:17:09.756 { 00:17:09.756 "name": "BaseBdev2", 00:17:09.756 "uuid": "9cb75d0d-eed0-4aae-a468-b57e7543c90d", 00:17:09.756 "is_configured": true, 00:17:09.756 "data_offset": 2048, 00:17:09.756 "data_size": 63488 00:17:09.756 }, 00:17:09.756 { 00:17:09.756 "name": "BaseBdev3", 00:17:09.756 "uuid": "e9b34022-2b5c-4919-b3f9-efd72fb392d1", 00:17:09.756 "is_configured": true, 00:17:09.756 "data_offset": 2048, 00:17:09.756 "data_size": 63488 00:17:09.756 }, 00:17:09.756 { 00:17:09.756 "name": "BaseBdev4", 00:17:09.756 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.756 "is_configured": false, 00:17:09.756 "data_offset": 0, 00:17:09.756 "data_size": 0 00:17:09.756 } 00:17:09.756 ] 00:17:09.756 }' 00:17:09.756 11:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:09.756 11:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:10.322 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:10.581 [2024-05-14 11:53:37.583585] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:10.581 [2024-05-14 11:53:37.583762] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x25941b0 00:17:10.581 [2024-05-14 11:53:37.583776] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:10.581 [2024-05-14 11:53:37.583961] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2595860 00:17:10.581 [2024-05-14 
11:53:37.584090] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25941b0 00:17:10.581 [2024-05-14 11:53:37.584099] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25941b0 00:17:10.581 [2024-05-14 11:53:37.584199] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:10.581 BaseBdev4 00:17:10.581 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:17:10.581 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:17:10.581 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:10.581 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:10.581 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:10.581 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:10.581 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.839 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:11.099 [ 00:17:11.099 { 00:17:11.099 "name": "BaseBdev4", 00:17:11.099 "aliases": [ 00:17:11.099 "d6f35101-e1ef-432d-9a91-e8d5382e427d" 00:17:11.099 ], 00:17:11.099 "product_name": "Malloc disk", 00:17:11.099 "block_size": 512, 00:17:11.099 "num_blocks": 65536, 00:17:11.099 "uuid": "d6f35101-e1ef-432d-9a91-e8d5382e427d", 00:17:11.099 "assigned_rate_limits": { 00:17:11.099 "rw_ios_per_sec": 0, 00:17:11.099 "rw_mbytes_per_sec": 0, 00:17:11.099 "r_mbytes_per_sec": 0, 00:17:11.099 
"w_mbytes_per_sec": 0 00:17:11.099 }, 00:17:11.099 "claimed": true, 00:17:11.099 "claim_type": "exclusive_write", 00:17:11.099 "zoned": false, 00:17:11.099 "supported_io_types": { 00:17:11.099 "read": true, 00:17:11.099 "write": true, 00:17:11.099 "unmap": true, 00:17:11.099 "write_zeroes": true, 00:17:11.099 "flush": true, 00:17:11.099 "reset": true, 00:17:11.099 "compare": false, 00:17:11.099 "compare_and_write": false, 00:17:11.099 "abort": true, 00:17:11.099 "nvme_admin": false, 00:17:11.099 "nvme_io": false 00:17:11.099 }, 00:17:11.099 "memory_domains": [ 00:17:11.099 { 00:17:11.099 "dma_device_id": "system", 00:17:11.099 "dma_device_type": 1 00:17:11.099 }, 00:17:11.099 { 00:17:11.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.099 "dma_device_type": 2 00:17:11.099 } 00:17:11.099 ], 00:17:11.099 "driver_specific": {} 00:17:11.099 } 00:17:11.099 ] 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.099 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.358 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:11.358 "name": "Existed_Raid", 00:17:11.358 "uuid": "b2ea393a-02fc-48eb-8e83-6aad6f9618e7", 00:17:11.358 "strip_size_kb": 64, 00:17:11.358 "state": "online", 00:17:11.358 "raid_level": "concat", 00:17:11.358 "superblock": true, 00:17:11.358 "num_base_bdevs": 4, 00:17:11.358 "num_base_bdevs_discovered": 4, 00:17:11.358 "num_base_bdevs_operational": 4, 00:17:11.358 "base_bdevs_list": [ 00:17:11.358 { 00:17:11.358 "name": "BaseBdev1", 00:17:11.358 "uuid": "c6204f20-1a3e-4506-8ac0-84914f41d005", 00:17:11.358 "is_configured": true, 00:17:11.358 "data_offset": 2048, 00:17:11.358 "data_size": 63488 00:17:11.358 }, 00:17:11.358 { 00:17:11.358 "name": "BaseBdev2", 00:17:11.358 "uuid": "9cb75d0d-eed0-4aae-a468-b57e7543c90d", 00:17:11.358 "is_configured": true, 00:17:11.358 "data_offset": 2048, 00:17:11.358 "data_size": 63488 00:17:11.358 }, 00:17:11.358 { 00:17:11.358 "name": "BaseBdev3", 00:17:11.358 "uuid": "e9b34022-2b5c-4919-b3f9-efd72fb392d1", 00:17:11.358 "is_configured": true, 00:17:11.358 "data_offset": 2048, 00:17:11.358 "data_size": 63488 00:17:11.358 }, 00:17:11.358 { 00:17:11.358 "name": "BaseBdev4", 00:17:11.358 "uuid": 
"d6f35101-e1ef-432d-9a91-e8d5382e427d", 00:17:11.358 "is_configured": true, 00:17:11.358 "data_offset": 2048, 00:17:11.358 "data_size": 63488 00:17:11.358 } 00:17:11.358 ] 00:17:11.358 }' 00:17:11.358 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:11.358 11:53:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:11.926 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:17:11.926 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:17:11.926 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:11.926 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:11.926 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:11.926 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:17:11.926 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:11.926 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:12.184 [2024-05-14 11:53:39.172114] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:12.184 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:12.184 "name": "Existed_Raid", 00:17:12.184 "aliases": [ 00:17:12.184 "b2ea393a-02fc-48eb-8e83-6aad6f9618e7" 00:17:12.184 ], 00:17:12.184 "product_name": "Raid Volume", 00:17:12.184 "block_size": 512, 00:17:12.184 "num_blocks": 253952, 00:17:12.184 "uuid": "b2ea393a-02fc-48eb-8e83-6aad6f9618e7", 00:17:12.184 "assigned_rate_limits": { 00:17:12.184 "rw_ios_per_sec": 
0, 00:17:12.184 "rw_mbytes_per_sec": 0, 00:17:12.184 "r_mbytes_per_sec": 0, 00:17:12.184 "w_mbytes_per_sec": 0 00:17:12.184 }, 00:17:12.184 "claimed": false, 00:17:12.184 "zoned": false, 00:17:12.184 "supported_io_types": { 00:17:12.184 "read": true, 00:17:12.184 "write": true, 00:17:12.184 "unmap": true, 00:17:12.184 "write_zeroes": true, 00:17:12.184 "flush": true, 00:17:12.184 "reset": true, 00:17:12.184 "compare": false, 00:17:12.184 "compare_and_write": false, 00:17:12.184 "abort": false, 00:17:12.184 "nvme_admin": false, 00:17:12.184 "nvme_io": false 00:17:12.184 }, 00:17:12.184 "memory_domains": [ 00:17:12.184 { 00:17:12.184 "dma_device_id": "system", 00:17:12.184 "dma_device_type": 1 00:17:12.184 }, 00:17:12.184 { 00:17:12.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.184 "dma_device_type": 2 00:17:12.184 }, 00:17:12.184 { 00:17:12.184 "dma_device_id": "system", 00:17:12.184 "dma_device_type": 1 00:17:12.184 }, 00:17:12.184 { 00:17:12.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.184 "dma_device_type": 2 00:17:12.184 }, 00:17:12.184 { 00:17:12.184 "dma_device_id": "system", 00:17:12.184 "dma_device_type": 1 00:17:12.184 }, 00:17:12.184 { 00:17:12.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.184 "dma_device_type": 2 00:17:12.184 }, 00:17:12.184 { 00:17:12.184 "dma_device_id": "system", 00:17:12.184 "dma_device_type": 1 00:17:12.184 }, 00:17:12.184 { 00:17:12.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.184 "dma_device_type": 2 00:17:12.184 } 00:17:12.184 ], 00:17:12.184 "driver_specific": { 00:17:12.184 "raid": { 00:17:12.184 "uuid": "b2ea393a-02fc-48eb-8e83-6aad6f9618e7", 00:17:12.184 "strip_size_kb": 64, 00:17:12.184 "state": "online", 00:17:12.184 "raid_level": "concat", 00:17:12.184 "superblock": true, 00:17:12.184 "num_base_bdevs": 4, 00:17:12.184 "num_base_bdevs_discovered": 4, 00:17:12.184 "num_base_bdevs_operational": 4, 00:17:12.184 "base_bdevs_list": [ 00:17:12.184 { 00:17:12.184 "name": "BaseBdev1", 
00:17:12.184 "uuid": "c6204f20-1a3e-4506-8ac0-84914f41d005", 00:17:12.184 "is_configured": true, 00:17:12.184 "data_offset": 2048, 00:17:12.185 "data_size": 63488 00:17:12.185 }, 00:17:12.185 { 00:17:12.185 "name": "BaseBdev2", 00:17:12.185 "uuid": "9cb75d0d-eed0-4aae-a468-b57e7543c90d", 00:17:12.185 "is_configured": true, 00:17:12.185 "data_offset": 2048, 00:17:12.185 "data_size": 63488 00:17:12.185 }, 00:17:12.185 { 00:17:12.185 "name": "BaseBdev3", 00:17:12.185 "uuid": "e9b34022-2b5c-4919-b3f9-efd72fb392d1", 00:17:12.185 "is_configured": true, 00:17:12.185 "data_offset": 2048, 00:17:12.185 "data_size": 63488 00:17:12.185 }, 00:17:12.185 { 00:17:12.185 "name": "BaseBdev4", 00:17:12.185 "uuid": "d6f35101-e1ef-432d-9a91-e8d5382e427d", 00:17:12.185 "is_configured": true, 00:17:12.185 "data_offset": 2048, 00:17:12.185 "data_size": 63488 00:17:12.185 } 00:17:12.185 ] 00:17:12.185 } 00:17:12.185 } 00:17:12.185 }' 00:17:12.185 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:12.185 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:17:12.185 BaseBdev2 00:17:12.185 BaseBdev3 00:17:12.185 BaseBdev4' 00:17:12.185 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:12.185 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:12.185 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:12.444 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:12.444 "name": "BaseBdev1", 00:17:12.444 "aliases": [ 00:17:12.444 "c6204f20-1a3e-4506-8ac0-84914f41d005" 00:17:12.444 ], 00:17:12.444 "product_name": "Malloc disk", 
00:17:12.444 "block_size": 512, 00:17:12.444 "num_blocks": 65536, 00:17:12.444 "uuid": "c6204f20-1a3e-4506-8ac0-84914f41d005", 00:17:12.444 "assigned_rate_limits": { 00:17:12.444 "rw_ios_per_sec": 0, 00:17:12.444 "rw_mbytes_per_sec": 0, 00:17:12.444 "r_mbytes_per_sec": 0, 00:17:12.444 "w_mbytes_per_sec": 0 00:17:12.444 }, 00:17:12.444 "claimed": true, 00:17:12.444 "claim_type": "exclusive_write", 00:17:12.444 "zoned": false, 00:17:12.444 "supported_io_types": { 00:17:12.444 "read": true, 00:17:12.444 "write": true, 00:17:12.444 "unmap": true, 00:17:12.444 "write_zeroes": true, 00:17:12.444 "flush": true, 00:17:12.444 "reset": true, 00:17:12.444 "compare": false, 00:17:12.444 "compare_and_write": false, 00:17:12.444 "abort": true, 00:17:12.444 "nvme_admin": false, 00:17:12.444 "nvme_io": false 00:17:12.444 }, 00:17:12.444 "memory_domains": [ 00:17:12.444 { 00:17:12.444 "dma_device_id": "system", 00:17:12.444 "dma_device_type": 1 00:17:12.444 }, 00:17:12.444 { 00:17:12.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.444 "dma_device_type": 2 00:17:12.444 } 00:17:12.444 ], 00:17:12.444 "driver_specific": {} 00:17:12.444 }' 00:17:12.444 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:12.444 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:12.444 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:12.444 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 
00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:12.774 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:13.031 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:13.031 "name": "BaseBdev2", 00:17:13.031 "aliases": [ 00:17:13.031 "9cb75d0d-eed0-4aae-a468-b57e7543c90d" 00:17:13.031 ], 00:17:13.031 "product_name": "Malloc disk", 00:17:13.031 "block_size": 512, 00:17:13.031 "num_blocks": 65536, 00:17:13.031 "uuid": "9cb75d0d-eed0-4aae-a468-b57e7543c90d", 00:17:13.031 "assigned_rate_limits": { 00:17:13.031 "rw_ios_per_sec": 0, 00:17:13.031 "rw_mbytes_per_sec": 0, 00:17:13.031 "r_mbytes_per_sec": 0, 00:17:13.031 "w_mbytes_per_sec": 0 00:17:13.031 }, 00:17:13.031 "claimed": true, 00:17:13.031 "claim_type": "exclusive_write", 00:17:13.031 "zoned": false, 00:17:13.031 "supported_io_types": { 00:17:13.031 "read": true, 00:17:13.031 "write": true, 00:17:13.031 "unmap": true, 00:17:13.031 "write_zeroes": true, 00:17:13.032 "flush": true, 00:17:13.032 "reset": true, 00:17:13.032 "compare": false, 00:17:13.032 "compare_and_write": false, 00:17:13.032 "abort": true, 00:17:13.032 "nvme_admin": false, 00:17:13.032 "nvme_io": false 00:17:13.032 }, 00:17:13.032 "memory_domains": [ 00:17:13.032 { 
00:17:13.032 "dma_device_id": "system", 00:17:13.032 "dma_device_type": 1 00:17:13.032 }, 00:17:13.032 { 00:17:13.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.032 "dma_device_type": 2 00:17:13.032 } 00:17:13.032 ], 00:17:13.032 "driver_specific": {} 00:17:13.032 }' 00:17:13.032 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:13.032 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:13.032 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:13.032 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:13.290 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:13.549 11:53:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:13.549 "name": "BaseBdev3", 00:17:13.549 "aliases": [ 00:17:13.549 "e9b34022-2b5c-4919-b3f9-efd72fb392d1" 00:17:13.549 ], 00:17:13.549 "product_name": "Malloc disk", 00:17:13.549 "block_size": 512, 00:17:13.549 "num_blocks": 65536, 00:17:13.549 "uuid": "e9b34022-2b5c-4919-b3f9-efd72fb392d1", 00:17:13.549 "assigned_rate_limits": { 00:17:13.549 "rw_ios_per_sec": 0, 00:17:13.549 "rw_mbytes_per_sec": 0, 00:17:13.549 "r_mbytes_per_sec": 0, 00:17:13.549 "w_mbytes_per_sec": 0 00:17:13.549 }, 00:17:13.549 "claimed": true, 00:17:13.549 "claim_type": "exclusive_write", 00:17:13.549 "zoned": false, 00:17:13.549 "supported_io_types": { 00:17:13.549 "read": true, 00:17:13.549 "write": true, 00:17:13.549 "unmap": true, 00:17:13.549 "write_zeroes": true, 00:17:13.549 "flush": true, 00:17:13.549 "reset": true, 00:17:13.549 "compare": false, 00:17:13.549 "compare_and_write": false, 00:17:13.549 "abort": true, 00:17:13.549 "nvme_admin": false, 00:17:13.549 "nvme_io": false 00:17:13.549 }, 00:17:13.549 "memory_domains": [ 00:17:13.549 { 00:17:13.549 "dma_device_id": "system", 00:17:13.549 "dma_device_type": 1 00:17:13.549 }, 00:17:13.549 { 00:17:13.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.549 "dma_device_type": 2 00:17:13.549 } 00:17:13.549 ], 00:17:13.549 "driver_specific": {} 00:17:13.549 }' 00:17:13.549 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:13.549 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:13.549 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:13.549 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:13.807 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:14.067 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:14.067 "name": "BaseBdev4", 00:17:14.067 "aliases": [ 00:17:14.067 "d6f35101-e1ef-432d-9a91-e8d5382e427d" 00:17:14.067 ], 00:17:14.067 "product_name": "Malloc disk", 00:17:14.067 "block_size": 512, 00:17:14.067 "num_blocks": 65536, 00:17:14.067 "uuid": "d6f35101-e1ef-432d-9a91-e8d5382e427d", 00:17:14.067 "assigned_rate_limits": { 00:17:14.067 "rw_ios_per_sec": 0, 00:17:14.067 "rw_mbytes_per_sec": 0, 00:17:14.067 "r_mbytes_per_sec": 0, 00:17:14.067 "w_mbytes_per_sec": 0 00:17:14.067 }, 00:17:14.067 "claimed": true, 00:17:14.067 "claim_type": "exclusive_write", 00:17:14.067 "zoned": false, 00:17:14.067 "supported_io_types": { 00:17:14.067 "read": true, 00:17:14.067 "write": true, 00:17:14.067 "unmap": true, 00:17:14.067 "write_zeroes": true, 00:17:14.067 "flush": 
true, 00:17:14.067 "reset": true, 00:17:14.067 "compare": false, 00:17:14.067 "compare_and_write": false, 00:17:14.067 "abort": true, 00:17:14.067 "nvme_admin": false, 00:17:14.067 "nvme_io": false 00:17:14.067 }, 00:17:14.067 "memory_domains": [ 00:17:14.067 { 00:17:14.067 "dma_device_id": "system", 00:17:14.067 "dma_device_type": 1 00:17:14.067 }, 00:17:14.067 { 00:17:14.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.067 "dma_device_type": 2 00:17:14.067 } 00:17:14.067 ], 00:17:14.067 "driver_specific": {} 00:17:14.067 }' 00:17:14.067 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.067 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.326 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:14.326 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.326 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.326 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:14.326 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:14.326 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:14.326 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:14.326 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:14.326 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:14.584 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:14.585 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:17:14.585 [2024-05-14 11:53:41.670538] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:14.585 [2024-05-14 11:53:41.670569] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:14.844 [2024-05-14 11:53:41.670626] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local 
num_base_bdevs_discovered 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.844 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.103 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:15.103 "name": "Existed_Raid", 00:17:15.103 "uuid": "b2ea393a-02fc-48eb-8e83-6aad6f9618e7", 00:17:15.103 "strip_size_kb": 64, 00:17:15.103 "state": "offline", 00:17:15.103 "raid_level": "concat", 00:17:15.103 "superblock": true, 00:17:15.103 "num_base_bdevs": 4, 00:17:15.103 "num_base_bdevs_discovered": 3, 00:17:15.103 "num_base_bdevs_operational": 3, 00:17:15.103 "base_bdevs_list": [ 00:17:15.103 { 00:17:15.103 "name": null, 00:17:15.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.103 "is_configured": false, 00:17:15.103 "data_offset": 2048, 00:17:15.103 "data_size": 63488 00:17:15.103 }, 00:17:15.103 { 00:17:15.103 "name": "BaseBdev2", 00:17:15.103 "uuid": "9cb75d0d-eed0-4aae-a468-b57e7543c90d", 00:17:15.103 "is_configured": true, 00:17:15.103 "data_offset": 2048, 00:17:15.103 "data_size": 63488 00:17:15.103 }, 00:17:15.103 { 00:17:15.103 "name": "BaseBdev3", 00:17:15.103 "uuid": "e9b34022-2b5c-4919-b3f9-efd72fb392d1", 00:17:15.103 "is_configured": true, 00:17:15.103 "data_offset": 2048, 00:17:15.103 "data_size": 63488 00:17:15.103 }, 00:17:15.103 { 00:17:15.103 "name": "BaseBdev4", 00:17:15.103 "uuid": "d6f35101-e1ef-432d-9a91-e8d5382e427d", 00:17:15.103 "is_configured": true, 00:17:15.103 "data_offset": 2048, 00:17:15.103 "data_size": 63488 00:17:15.103 } 00:17:15.103 ] 00:17:15.103 }' 00:17:15.103 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # 
xtrace_disable 00:17:15.103 11:53:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:15.671 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:17:15.671 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:15.671 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.671 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:15.930 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:15.930 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:15.930 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:15.930 [2024-05-14 11:53:42.995199] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:16.189 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:16.189 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:16.189 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.189 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:16.189 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:16.189 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:16.189 
11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:16.449 [2024-05-14 11:53:43.487101] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:16.449 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:16.449 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:16.449 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.449 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:16.706 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:16.706 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:16.706 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:16.964 [2024-05-14 11:53:43.931161] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:16.964 [2024-05-14 11:53:43.931209] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25941b0 name Existed_Raid, state offline 00:17:16.964 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:16.964 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:16.964 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.964 11:53:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:17:17.221 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:17:17.221 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:17:17.221 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:17:17.221 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:17:17.221 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:17.221 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:17.479 BaseBdev2 00:17:17.479 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:17:17.479 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:17:17.479 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:17.479 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:17.479 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:17.479 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:17.479 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:17.737 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 
00:17:17.995 [ 00:17:17.995 { 00:17:17.995 "name": "BaseBdev2", 00:17:17.995 "aliases": [ 00:17:17.995 "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5" 00:17:17.995 ], 00:17:17.995 "product_name": "Malloc disk", 00:17:17.995 "block_size": 512, 00:17:17.995 "num_blocks": 65536, 00:17:17.995 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:17.995 "assigned_rate_limits": { 00:17:17.995 "rw_ios_per_sec": 0, 00:17:17.995 "rw_mbytes_per_sec": 0, 00:17:17.995 "r_mbytes_per_sec": 0, 00:17:17.995 "w_mbytes_per_sec": 0 00:17:17.995 }, 00:17:17.995 "claimed": false, 00:17:17.996 "zoned": false, 00:17:17.996 "supported_io_types": { 00:17:17.996 "read": true, 00:17:17.996 "write": true, 00:17:17.996 "unmap": true, 00:17:17.996 "write_zeroes": true, 00:17:17.996 "flush": true, 00:17:17.996 "reset": true, 00:17:17.996 "compare": false, 00:17:17.996 "compare_and_write": false, 00:17:17.996 "abort": true, 00:17:17.996 "nvme_admin": false, 00:17:17.996 "nvme_io": false 00:17:17.996 }, 00:17:17.996 "memory_domains": [ 00:17:17.996 { 00:17:17.996 "dma_device_id": "system", 00:17:17.996 "dma_device_type": 1 00:17:17.996 }, 00:17:17.996 { 00:17:17.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:17.996 "dma_device_type": 2 00:17:17.996 } 00:17:17.996 ], 00:17:17.996 "driver_specific": {} 00:17:17.996 } 00:17:17.996 ] 00:17:17.996 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:17.996 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:17.996 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:17.996 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:18.254 BaseBdev3 00:17:18.254 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 
00:17:18.254 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:17:18.254 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:18.254 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:18.254 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:18.255 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:18.255 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:18.513 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:18.772 [ 00:17:18.772 { 00:17:18.772 "name": "BaseBdev3", 00:17:18.772 "aliases": [ 00:17:18.772 "f2e24bda-59f5-48b5-9735-405c4d195a87" 00:17:18.772 ], 00:17:18.772 "product_name": "Malloc disk", 00:17:18.772 "block_size": 512, 00:17:18.772 "num_blocks": 65536, 00:17:18.772 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:18.772 "assigned_rate_limits": { 00:17:18.772 "rw_ios_per_sec": 0, 00:17:18.772 "rw_mbytes_per_sec": 0, 00:17:18.772 "r_mbytes_per_sec": 0, 00:17:18.772 "w_mbytes_per_sec": 0 00:17:18.772 }, 00:17:18.772 "claimed": false, 00:17:18.772 "zoned": false, 00:17:18.772 "supported_io_types": { 00:17:18.772 "read": true, 00:17:18.772 "write": true, 00:17:18.772 "unmap": true, 00:17:18.772 "write_zeroes": true, 00:17:18.772 "flush": true, 00:17:18.772 "reset": true, 00:17:18.772 "compare": false, 00:17:18.772 "compare_and_write": false, 00:17:18.772 "abort": true, 00:17:18.772 "nvme_admin": false, 00:17:18.772 "nvme_io": false 00:17:18.772 }, 00:17:18.772 
"memory_domains": [ 00:17:18.772 { 00:17:18.772 "dma_device_id": "system", 00:17:18.772 "dma_device_type": 1 00:17:18.772 }, 00:17:18.772 { 00:17:18.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.772 "dma_device_type": 2 00:17:18.772 } 00:17:18.772 ], 00:17:18.772 "driver_specific": {} 00:17:18.772 } 00:17:18.772 ] 00:17:18.772 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:18.772 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:18.772 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:18.772 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:18.772 BaseBdev4 00:17:18.772 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:17:18.772 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:17:18.772 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:18.772 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:18.772 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:19.031 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:19.031 11:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.031 11:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 -t 2000 00:17:19.289 [ 00:17:19.289 { 00:17:19.289 "name": "BaseBdev4", 00:17:19.289 "aliases": [ 00:17:19.289 "8cec04be-476d-45f1-9001-57355c8119c5" 00:17:19.289 ], 00:17:19.289 "product_name": "Malloc disk", 00:17:19.289 "block_size": 512, 00:17:19.289 "num_blocks": 65536, 00:17:19.289 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:19.289 "assigned_rate_limits": { 00:17:19.289 "rw_ios_per_sec": 0, 00:17:19.289 "rw_mbytes_per_sec": 0, 00:17:19.289 "r_mbytes_per_sec": 0, 00:17:19.289 "w_mbytes_per_sec": 0 00:17:19.289 }, 00:17:19.289 "claimed": false, 00:17:19.289 "zoned": false, 00:17:19.289 "supported_io_types": { 00:17:19.289 "read": true, 00:17:19.289 "write": true, 00:17:19.289 "unmap": true, 00:17:19.289 "write_zeroes": true, 00:17:19.289 "flush": true, 00:17:19.289 "reset": true, 00:17:19.289 "compare": false, 00:17:19.289 "compare_and_write": false, 00:17:19.289 "abort": true, 00:17:19.289 "nvme_admin": false, 00:17:19.289 "nvme_io": false 00:17:19.289 }, 00:17:19.289 "memory_domains": [ 00:17:19.289 { 00:17:19.289 "dma_device_id": "system", 00:17:19.289 "dma_device_type": 1 00:17:19.289 }, 00:17:19.289 { 00:17:19.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.290 "dma_device_type": 2 00:17:19.290 } 00:17:19.290 ], 00:17:19.290 "driver_specific": {} 00:17:19.290 } 00:17:19.290 ] 00:17:19.290 11:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:19.290 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:19.290 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:19.290 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:19.548 [2024-05-14 11:53:46.541080] 
bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:19.548 [2024-05-14 11:53:46.541122] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:19.548 [2024-05-14 11:53:46.541142] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:19.548 [2024-05-14 11:53:46.542478] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:19.548 [2024-05-14 11:53:46.542520] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:17:19.548 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:19.806 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:19.806 "name": "Existed_Raid", 00:17:19.806 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:19.806 "strip_size_kb": 64, 00:17:19.806 "state": "configuring", 00:17:19.806 "raid_level": "concat", 00:17:19.806 "superblock": true, 00:17:19.806 "num_base_bdevs": 4, 00:17:19.806 "num_base_bdevs_discovered": 3, 00:17:19.806 "num_base_bdevs_operational": 4, 00:17:19.806 "base_bdevs_list": [ 00:17:19.806 { 00:17:19.806 "name": "BaseBdev1", 00:17:19.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:19.806 "is_configured": false, 00:17:19.806 "data_offset": 0, 00:17:19.806 "data_size": 0 00:17:19.806 }, 00:17:19.806 { 00:17:19.806 "name": "BaseBdev2", 00:17:19.806 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:19.806 "is_configured": true, 00:17:19.806 "data_offset": 2048, 00:17:19.806 "data_size": 63488 00:17:19.806 }, 00:17:19.806 { 00:17:19.806 "name": "BaseBdev3", 00:17:19.806 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:19.806 "is_configured": true, 00:17:19.806 "data_offset": 2048, 00:17:19.806 "data_size": 63488 00:17:19.806 }, 00:17:19.806 { 00:17:19.806 "name": "BaseBdev4", 00:17:19.806 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:19.806 "is_configured": true, 00:17:19.806 "data_offset": 2048, 00:17:19.806 "data_size": 63488 00:17:19.807 } 00:17:19.807 ] 00:17:19.807 }' 00:17:19.807 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:19.807 11:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:20.375 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev2 00:17:20.633 [2024-05-14 11:53:47.567763] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.633 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.891 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:20.891 "name": "Existed_Raid", 00:17:20.891 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:20.891 "strip_size_kb": 64, 00:17:20.891 "state": "configuring", 00:17:20.891 "raid_level": "concat", 00:17:20.891 "superblock": true, 
00:17:20.891 "num_base_bdevs": 4, 00:17:20.891 "num_base_bdevs_discovered": 2, 00:17:20.891 "num_base_bdevs_operational": 4, 00:17:20.891 "base_bdevs_list": [ 00:17:20.891 { 00:17:20.891 "name": "BaseBdev1", 00:17:20.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.891 "is_configured": false, 00:17:20.891 "data_offset": 0, 00:17:20.891 "data_size": 0 00:17:20.891 }, 00:17:20.891 { 00:17:20.891 "name": null, 00:17:20.891 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:20.891 "is_configured": false, 00:17:20.891 "data_offset": 2048, 00:17:20.891 "data_size": 63488 00:17:20.891 }, 00:17:20.891 { 00:17:20.891 "name": "BaseBdev3", 00:17:20.891 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:20.891 "is_configured": true, 00:17:20.891 "data_offset": 2048, 00:17:20.891 "data_size": 63488 00:17:20.891 }, 00:17:20.891 { 00:17:20.891 "name": "BaseBdev4", 00:17:20.891 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:20.892 "is_configured": true, 00:17:20.892 "data_offset": 2048, 00:17:20.892 "data_size": 63488 00:17:20.892 } 00:17:20.892 ] 00:17:20.892 }' 00:17:20.892 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:20.892 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:21.458 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.458 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:21.716 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:17:21.716 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:21.974 [2024-05-14 
11:53:48.850679] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:21.974 BaseBdev1 00:17:21.974 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:17:21.974 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:17:21.974 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:21.974 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:21.974 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:21.974 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:21.974 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:22.232 11:53:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:22.491 [ 00:17:22.491 { 00:17:22.491 "name": "BaseBdev1", 00:17:22.491 "aliases": [ 00:17:22.491 "4c17002a-86d9-4e89-8661-396a155c6244" 00:17:22.491 ], 00:17:22.491 "product_name": "Malloc disk", 00:17:22.491 "block_size": 512, 00:17:22.491 "num_blocks": 65536, 00:17:22.491 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:22.491 "assigned_rate_limits": { 00:17:22.491 "rw_ios_per_sec": 0, 00:17:22.491 "rw_mbytes_per_sec": 0, 00:17:22.491 "r_mbytes_per_sec": 0, 00:17:22.491 "w_mbytes_per_sec": 0 00:17:22.491 }, 00:17:22.491 "claimed": true, 00:17:22.491 "claim_type": "exclusive_write", 00:17:22.491 "zoned": false, 00:17:22.491 "supported_io_types": { 00:17:22.491 "read": true, 00:17:22.491 "write": true, 00:17:22.491 "unmap": true, 
00:17:22.491 "write_zeroes": true, 00:17:22.491 "flush": true, 00:17:22.491 "reset": true, 00:17:22.491 "compare": false, 00:17:22.491 "compare_and_write": false, 00:17:22.491 "abort": true, 00:17:22.491 "nvme_admin": false, 00:17:22.491 "nvme_io": false 00:17:22.491 }, 00:17:22.491 "memory_domains": [ 00:17:22.491 { 00:17:22.491 "dma_device_id": "system", 00:17:22.491 "dma_device_type": 1 00:17:22.491 }, 00:17:22.491 { 00:17:22.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.491 "dma_device_type": 2 00:17:22.491 } 00:17:22.491 ], 00:17:22.491 "driver_specific": {} 00:17:22.491 } 00:17:22.491 ] 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.491 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.750 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:22.750 "name": "Existed_Raid", 00:17:22.750 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:22.750 "strip_size_kb": 64, 00:17:22.750 "state": "configuring", 00:17:22.750 "raid_level": "concat", 00:17:22.750 "superblock": true, 00:17:22.750 "num_base_bdevs": 4, 00:17:22.750 "num_base_bdevs_discovered": 3, 00:17:22.750 "num_base_bdevs_operational": 4, 00:17:22.750 "base_bdevs_list": [ 00:17:22.750 { 00:17:22.750 "name": "BaseBdev1", 00:17:22.750 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:22.750 "is_configured": true, 00:17:22.750 "data_offset": 2048, 00:17:22.750 "data_size": 63488 00:17:22.750 }, 00:17:22.750 { 00:17:22.750 "name": null, 00:17:22.750 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:22.750 "is_configured": false, 00:17:22.750 "data_offset": 2048, 00:17:22.750 "data_size": 63488 00:17:22.750 }, 00:17:22.750 { 00:17:22.750 "name": "BaseBdev3", 00:17:22.750 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:22.750 "is_configured": true, 00:17:22.750 "data_offset": 2048, 00:17:22.750 "data_size": 63488 00:17:22.750 }, 00:17:22.750 { 00:17:22.750 "name": "BaseBdev4", 00:17:22.750 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:22.750 "is_configured": true, 00:17:22.750 "data_offset": 2048, 00:17:22.750 "data_size": 63488 00:17:22.750 } 00:17:22.750 ] 00:17:22.750 }' 00:17:22.750 11:53:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:22.750 11:53:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:23.318 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.319 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:23.319 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:17:23.319 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:23.577 [2024-05-14 11:53:50.599348] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.577 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.836 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:23.836 "name": "Existed_Raid", 00:17:23.836 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:23.836 "strip_size_kb": 64, 00:17:23.836 "state": "configuring", 00:17:23.836 "raid_level": "concat", 00:17:23.836 "superblock": true, 00:17:23.836 "num_base_bdevs": 4, 00:17:23.836 "num_base_bdevs_discovered": 2, 00:17:23.836 "num_base_bdevs_operational": 4, 00:17:23.836 "base_bdevs_list": [ 00:17:23.836 { 00:17:23.836 "name": "BaseBdev1", 00:17:23.836 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:23.836 "is_configured": true, 00:17:23.836 "data_offset": 2048, 00:17:23.836 "data_size": 63488 00:17:23.836 }, 00:17:23.836 { 00:17:23.836 "name": null, 00:17:23.836 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:23.836 "is_configured": false, 00:17:23.836 "data_offset": 2048, 00:17:23.836 "data_size": 63488 00:17:23.836 }, 00:17:23.836 { 00:17:23.836 "name": null, 00:17:23.836 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:23.836 "is_configured": false, 00:17:23.836 "data_offset": 2048, 00:17:23.836 "data_size": 63488 00:17:23.836 }, 00:17:23.836 { 00:17:23.836 "name": "BaseBdev4", 00:17:23.836 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:23.836 "is_configured": true, 00:17:23.836 "data_offset": 2048, 00:17:23.836 "data_size": 63488 00:17:23.836 } 00:17:23.836 ] 00:17:23.836 }' 00:17:23.836 11:53:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:23.836 11:53:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:24.404 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:17:24.404 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.663 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:17:24.663 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:24.921 [2024-05-14 11:53:51.854694] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r 
'.[] | select(.name == "Existed_Raid")' 00:17:24.921 11:53:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.181 11:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:25.181 "name": "Existed_Raid", 00:17:25.181 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:25.181 "strip_size_kb": 64, 00:17:25.181 "state": "configuring", 00:17:25.181 "raid_level": "concat", 00:17:25.181 "superblock": true, 00:17:25.181 "num_base_bdevs": 4, 00:17:25.181 "num_base_bdevs_discovered": 3, 00:17:25.181 "num_base_bdevs_operational": 4, 00:17:25.181 "base_bdevs_list": [ 00:17:25.181 { 00:17:25.181 "name": "BaseBdev1", 00:17:25.181 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:25.181 "is_configured": true, 00:17:25.181 "data_offset": 2048, 00:17:25.181 "data_size": 63488 00:17:25.181 }, 00:17:25.181 { 00:17:25.181 "name": null, 00:17:25.181 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:25.181 "is_configured": false, 00:17:25.181 "data_offset": 2048, 00:17:25.181 "data_size": 63488 00:17:25.181 }, 00:17:25.181 { 00:17:25.181 "name": "BaseBdev3", 00:17:25.181 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:25.181 "is_configured": true, 00:17:25.181 "data_offset": 2048, 00:17:25.181 "data_size": 63488 00:17:25.181 }, 00:17:25.181 { 00:17:25.181 "name": "BaseBdev4", 00:17:25.181 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:25.181 "is_configured": true, 00:17:25.181 "data_offset": 2048, 00:17:25.181 "data_size": 63488 00:17:25.181 } 00:17:25.181 ] 00:17:25.181 }' 00:17:25.181 11:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:25.181 11:53:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:25.750 11:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.750 11:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:26.011 11:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:17:26.011 11:53:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:26.011 [2024-05-14 11:53:53.069929] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:26.271 "name": "Existed_Raid", 00:17:26.271 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:26.271 "strip_size_kb": 64, 00:17:26.271 "state": "configuring", 00:17:26.271 "raid_level": "concat", 00:17:26.271 "superblock": true, 00:17:26.271 "num_base_bdevs": 4, 00:17:26.271 "num_base_bdevs_discovered": 2, 00:17:26.271 "num_base_bdevs_operational": 4, 00:17:26.271 "base_bdevs_list": [ 00:17:26.271 { 00:17:26.271 "name": null, 00:17:26.271 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:26.271 "is_configured": false, 00:17:26.271 "data_offset": 2048, 00:17:26.271 "data_size": 63488 00:17:26.271 }, 00:17:26.271 { 00:17:26.271 "name": null, 00:17:26.271 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:26.271 "is_configured": false, 00:17:26.271 "data_offset": 2048, 00:17:26.271 "data_size": 63488 00:17:26.271 }, 00:17:26.271 { 00:17:26.271 "name": "BaseBdev3", 00:17:26.271 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:26.271 "is_configured": true, 00:17:26.271 "data_offset": 2048, 00:17:26.271 "data_size": 63488 00:17:26.271 }, 00:17:26.271 { 00:17:26.271 "name": "BaseBdev4", 00:17:26.271 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:26.271 "is_configured": true, 00:17:26.271 "data_offset": 2048, 00:17:26.271 "data_size": 63488 00:17:26.271 } 00:17:26.271 ] 00:17:26.271 }' 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:26.271 11:53:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:26.875 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:17:26.875 11:53:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.134 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:17:27.134 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:27.394 [2024-05-14 11:53:54.390039] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.394 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.653 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:27.653 "name": "Existed_Raid", 00:17:27.653 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:27.653 "strip_size_kb": 64, 00:17:27.653 "state": "configuring", 00:17:27.653 "raid_level": "concat", 00:17:27.653 "superblock": true, 00:17:27.653 "num_base_bdevs": 4, 00:17:27.653 "num_base_bdevs_discovered": 3, 00:17:27.653 "num_base_bdevs_operational": 4, 00:17:27.653 "base_bdevs_list": [ 00:17:27.653 { 00:17:27.653 "name": null, 00:17:27.653 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:27.653 "is_configured": false, 00:17:27.653 "data_offset": 2048, 00:17:27.653 "data_size": 63488 00:17:27.653 }, 00:17:27.653 { 00:17:27.653 "name": "BaseBdev2", 00:17:27.653 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:27.653 "is_configured": true, 00:17:27.653 "data_offset": 2048, 00:17:27.653 "data_size": 63488 00:17:27.653 }, 00:17:27.653 { 00:17:27.653 "name": "BaseBdev3", 00:17:27.653 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:27.653 "is_configured": true, 00:17:27.653 "data_offset": 2048, 00:17:27.653 "data_size": 63488 00:17:27.653 }, 00:17:27.653 { 00:17:27.653 "name": "BaseBdev4", 00:17:27.653 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:27.653 "is_configured": true, 00:17:27.653 "data_offset": 2048, 00:17:27.653 "data_size": 63488 00:17:27.653 } 00:17:27.653 ] 00:17:27.653 }' 00:17:27.653 11:53:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:27.653 11:53:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:28.220 11:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.220 11:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:28.480 11:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:17:28.480 11:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.480 11:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:28.740 11:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4c17002a-86d9-4e89-8661-396a155c6244 00:17:28.999 [2024-05-14 11:53:55.946854] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:28.999 [2024-05-14 11:53:55.947030] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x2745920 00:17:28.999 [2024-05-14 11:53:55.947043] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:28.999 [2024-05-14 11:53:55.947233] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24addf0 00:17:28.999 [2024-05-14 11:53:55.947357] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2745920 00:17:28.999 [2024-05-14 11:53:55.947367] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2745920 00:17:28.999 [2024-05-14 11:53:55.947477] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:28.999 NewBaseBdev 00:17:28.999 11:53:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:17:28.999 11:53:55 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:17:28.999 11:53:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:28.999 11:53:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:28.999 11:53:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:28.999 11:53:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:28.999 11:53:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:29.258 11:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:29.518 [ 00:17:29.518 { 00:17:29.518 "name": "NewBaseBdev", 00:17:29.518 "aliases": [ 00:17:29.518 "4c17002a-86d9-4e89-8661-396a155c6244" 00:17:29.518 ], 00:17:29.518 "product_name": "Malloc disk", 00:17:29.518 "block_size": 512, 00:17:29.518 "num_blocks": 65536, 00:17:29.518 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:29.518 "assigned_rate_limits": { 00:17:29.518 "rw_ios_per_sec": 0, 00:17:29.518 "rw_mbytes_per_sec": 0, 00:17:29.518 "r_mbytes_per_sec": 0, 00:17:29.518 "w_mbytes_per_sec": 0 00:17:29.518 }, 00:17:29.518 "claimed": true, 00:17:29.518 "claim_type": "exclusive_write", 00:17:29.518 "zoned": false, 00:17:29.518 "supported_io_types": { 00:17:29.518 "read": true, 00:17:29.518 "write": true, 00:17:29.518 "unmap": true, 00:17:29.518 "write_zeroes": true, 00:17:29.518 "flush": true, 00:17:29.518 "reset": true, 00:17:29.518 "compare": false, 00:17:29.518 "compare_and_write": false, 00:17:29.518 "abort": true, 00:17:29.518 "nvme_admin": false, 00:17:29.518 "nvme_io": false 
00:17:29.518 }, 00:17:29.518 "memory_domains": [ 00:17:29.518 { 00:17:29.518 "dma_device_id": "system", 00:17:29.518 "dma_device_type": 1 00:17:29.518 }, 00:17:29.518 { 00:17:29.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.518 "dma_device_type": 2 00:17:29.518 } 00:17:29.518 ], 00:17:29.518 "driver_specific": {} 00:17:29.518 } 00:17:29.518 ] 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:29.518 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.778 
11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:29.778 "name": "Existed_Raid", 00:17:29.778 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:29.778 "strip_size_kb": 64, 00:17:29.778 "state": "online", 00:17:29.778 "raid_level": "concat", 00:17:29.778 "superblock": true, 00:17:29.778 "num_base_bdevs": 4, 00:17:29.778 "num_base_bdevs_discovered": 4, 00:17:29.778 "num_base_bdevs_operational": 4, 00:17:29.778 "base_bdevs_list": [ 00:17:29.778 { 00:17:29.778 "name": "NewBaseBdev", 00:17:29.778 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:29.778 "is_configured": true, 00:17:29.778 "data_offset": 2048, 00:17:29.778 "data_size": 63488 00:17:29.778 }, 00:17:29.778 { 00:17:29.778 "name": "BaseBdev2", 00:17:29.778 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:29.778 "is_configured": true, 00:17:29.778 "data_offset": 2048, 00:17:29.778 "data_size": 63488 00:17:29.778 }, 00:17:29.778 { 00:17:29.778 "name": "BaseBdev3", 00:17:29.778 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:29.778 "is_configured": true, 00:17:29.778 "data_offset": 2048, 00:17:29.778 "data_size": 63488 00:17:29.778 }, 00:17:29.778 { 00:17:29.778 "name": "BaseBdev4", 00:17:29.778 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:29.778 "is_configured": true, 00:17:29.778 "data_offset": 2048, 00:17:29.778 "data_size": 63488 00:17:29.778 } 00:17:29.778 ] 00:17:29.778 }' 00:17:29.778 11:53:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:29.778 11:53:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:30.346 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:17:30.346 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:17:30.346 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local 
raid_bdev_info 00:17:30.346 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:30.346 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:30.346 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:17:30.346 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:30.346 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:30.346 [2024-05-14 11:53:57.423093] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:30.606 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:30.606 "name": "Existed_Raid", 00:17:30.606 "aliases": [ 00:17:30.606 "19454ede-0ca8-4d47-a5d3-c89ec275efa2" 00:17:30.606 ], 00:17:30.606 "product_name": "Raid Volume", 00:17:30.606 "block_size": 512, 00:17:30.606 "num_blocks": 253952, 00:17:30.606 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:30.606 "assigned_rate_limits": { 00:17:30.606 "rw_ios_per_sec": 0, 00:17:30.606 "rw_mbytes_per_sec": 0, 00:17:30.606 "r_mbytes_per_sec": 0, 00:17:30.606 "w_mbytes_per_sec": 0 00:17:30.606 }, 00:17:30.606 "claimed": false, 00:17:30.606 "zoned": false, 00:17:30.606 "supported_io_types": { 00:17:30.606 "read": true, 00:17:30.606 "write": true, 00:17:30.606 "unmap": true, 00:17:30.606 "write_zeroes": true, 00:17:30.606 "flush": true, 00:17:30.606 "reset": true, 00:17:30.606 "compare": false, 00:17:30.606 "compare_and_write": false, 00:17:30.606 "abort": false, 00:17:30.606 "nvme_admin": false, 00:17:30.606 "nvme_io": false 00:17:30.606 }, 00:17:30.606 "memory_domains": [ 00:17:30.606 { 00:17:30.606 "dma_device_id": "system", 00:17:30.606 "dma_device_type": 1 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.606 "dma_device_type": 2 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 "dma_device_id": "system", 00:17:30.606 "dma_device_type": 1 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.606 "dma_device_type": 2 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 "dma_device_id": "system", 00:17:30.606 "dma_device_type": 1 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.606 "dma_device_type": 2 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 "dma_device_id": "system", 00:17:30.606 "dma_device_type": 1 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.606 "dma_device_type": 2 00:17:30.606 } 00:17:30.606 ], 00:17:30.606 "driver_specific": { 00:17:30.606 "raid": { 00:17:30.606 "uuid": "19454ede-0ca8-4d47-a5d3-c89ec275efa2", 00:17:30.606 "strip_size_kb": 64, 00:17:30.606 "state": "online", 00:17:30.606 "raid_level": "concat", 00:17:30.606 "superblock": true, 00:17:30.606 "num_base_bdevs": 4, 00:17:30.606 "num_base_bdevs_discovered": 4, 00:17:30.606 "num_base_bdevs_operational": 4, 00:17:30.606 "base_bdevs_list": [ 00:17:30.606 { 00:17:30.606 "name": "NewBaseBdev", 00:17:30.606 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:30.606 "is_configured": true, 00:17:30.606 "data_offset": 2048, 00:17:30.606 "data_size": 63488 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 "name": "BaseBdev2", 00:17:30.606 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:30.606 "is_configured": true, 00:17:30.606 "data_offset": 2048, 00:17:30.606 "data_size": 63488 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 "name": "BaseBdev3", 00:17:30.606 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:30.606 "is_configured": true, 00:17:30.606 "data_offset": 2048, 00:17:30.606 "data_size": 63488 00:17:30.606 }, 00:17:30.606 { 00:17:30.606 "name": "BaseBdev4", 00:17:30.606 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 
00:17:30.606 "is_configured": true, 00:17:30.606 "data_offset": 2048, 00:17:30.606 "data_size": 63488 00:17:30.606 } 00:17:30.606 ] 00:17:30.606 } 00:17:30.606 } 00:17:30.606 }' 00:17:30.606 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:30.606 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:17:30.606 BaseBdev2 00:17:30.606 BaseBdev3 00:17:30.606 BaseBdev4' 00:17:30.606 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:30.606 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:30.606 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:30.606 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:30.606 "name": "NewBaseBdev", 00:17:30.606 "aliases": [ 00:17:30.606 "4c17002a-86d9-4e89-8661-396a155c6244" 00:17:30.606 ], 00:17:30.606 "product_name": "Malloc disk", 00:17:30.606 "block_size": 512, 00:17:30.606 "num_blocks": 65536, 00:17:30.606 "uuid": "4c17002a-86d9-4e89-8661-396a155c6244", 00:17:30.606 "assigned_rate_limits": { 00:17:30.606 "rw_ios_per_sec": 0, 00:17:30.606 "rw_mbytes_per_sec": 0, 00:17:30.606 "r_mbytes_per_sec": 0, 00:17:30.606 "w_mbytes_per_sec": 0 00:17:30.606 }, 00:17:30.606 "claimed": true, 00:17:30.606 "claim_type": "exclusive_write", 00:17:30.606 "zoned": false, 00:17:30.606 "supported_io_types": { 00:17:30.606 "read": true, 00:17:30.606 "write": true, 00:17:30.606 "unmap": true, 00:17:30.606 "write_zeroes": true, 00:17:30.606 "flush": true, 00:17:30.606 "reset": true, 00:17:30.606 "compare": false, 00:17:30.606 "compare_and_write": false, 00:17:30.606 "abort": true, 
00:17:30.606 "nvme_admin": false, 00:17:30.607 "nvme_io": false 00:17:30.607 }, 00:17:30.607 "memory_domains": [ 00:17:30.607 { 00:17:30.607 "dma_device_id": "system", 00:17:30.607 "dma_device_type": 1 00:17:30.607 }, 00:17:30.607 { 00:17:30.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.607 "dma_device_type": 2 00:17:30.607 } 00:17:30.607 ], 00:17:30.607 "driver_specific": {} 00:17:30.607 }' 00:17:30.607 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:30.866 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:31.124 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:31.124 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:31.124 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev2 00:17:31.124 11:53:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:31.124 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:31.124 "name": "BaseBdev2", 00:17:31.124 "aliases": [ 00:17:31.124 "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5" 00:17:31.124 ], 00:17:31.124 "product_name": "Malloc disk", 00:17:31.124 "block_size": 512, 00:17:31.124 "num_blocks": 65536, 00:17:31.124 "uuid": "0b9b3859-a1f7-42bd-a25f-97e0d1b539d5", 00:17:31.124 "assigned_rate_limits": { 00:17:31.124 "rw_ios_per_sec": 0, 00:17:31.124 "rw_mbytes_per_sec": 0, 00:17:31.124 "r_mbytes_per_sec": 0, 00:17:31.124 "w_mbytes_per_sec": 0 00:17:31.124 }, 00:17:31.124 "claimed": true, 00:17:31.124 "claim_type": "exclusive_write", 00:17:31.124 "zoned": false, 00:17:31.124 "supported_io_types": { 00:17:31.124 "read": true, 00:17:31.124 "write": true, 00:17:31.124 "unmap": true, 00:17:31.124 "write_zeroes": true, 00:17:31.124 "flush": true, 00:17:31.124 "reset": true, 00:17:31.124 "compare": false, 00:17:31.124 "compare_and_write": false, 00:17:31.124 "abort": true, 00:17:31.124 "nvme_admin": false, 00:17:31.124 "nvme_io": false 00:17:31.124 }, 00:17:31.124 "memory_domains": [ 00:17:31.124 { 00:17:31.124 "dma_device_id": "system", 00:17:31.124 "dma_device_type": 1 00:17:31.124 }, 00:17:31.124 { 00:17:31.124 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.124 "dma_device_type": 2 00:17:31.124 } 00:17:31.124 ], 00:17:31.124 "driver_specific": {} 00:17:31.124 }' 00:17:31.124 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:31.383 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:31.383 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:31.383 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:31.383 11:53:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:31.383 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.383 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:31.383 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:31.642 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.642 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:31.642 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:31.642 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:31.642 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:31.642 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:31.642 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:31.901 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:31.901 "name": "BaseBdev3", 00:17:31.901 "aliases": [ 00:17:31.901 "f2e24bda-59f5-48b5-9735-405c4d195a87" 00:17:31.901 ], 00:17:31.901 "product_name": "Malloc disk", 00:17:31.901 "block_size": 512, 00:17:31.901 "num_blocks": 65536, 00:17:31.901 "uuid": "f2e24bda-59f5-48b5-9735-405c4d195a87", 00:17:31.901 "assigned_rate_limits": { 00:17:31.901 "rw_ios_per_sec": 0, 00:17:31.901 "rw_mbytes_per_sec": 0, 00:17:31.901 "r_mbytes_per_sec": 0, 00:17:31.901 "w_mbytes_per_sec": 0 00:17:31.901 }, 00:17:31.901 "claimed": true, 00:17:31.901 "claim_type": "exclusive_write", 00:17:31.901 "zoned": false, 00:17:31.901 "supported_io_types": 
{ 00:17:31.901 "read": true, 00:17:31.901 "write": true, 00:17:31.901 "unmap": true, 00:17:31.901 "write_zeroes": true, 00:17:31.901 "flush": true, 00:17:31.901 "reset": true, 00:17:31.901 "compare": false, 00:17:31.901 "compare_and_write": false, 00:17:31.901 "abort": true, 00:17:31.901 "nvme_admin": false, 00:17:31.901 "nvme_io": false 00:17:31.901 }, 00:17:31.901 "memory_domains": [ 00:17:31.901 { 00:17:31.901 "dma_device_id": "system", 00:17:31.901 "dma_device_type": 1 00:17:31.901 }, 00:17:31.901 { 00:17:31.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.901 "dma_device_type": 2 00:17:31.901 } 00:17:31.901 ], 00:17:31.901 "driver_specific": {} 00:17:31.901 }' 00:17:31.901 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:31.901 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:31.901 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:31.901 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:31.901 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:31.901 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.901 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:31.901 11:53:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:32.160 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.160 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:32.160 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:32.160 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:32.160 11:53:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:32.160 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:32.160 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:32.419 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:32.419 "name": "BaseBdev4", 00:17:32.419 "aliases": [ 00:17:32.419 "8cec04be-476d-45f1-9001-57355c8119c5" 00:17:32.419 ], 00:17:32.419 "product_name": "Malloc disk", 00:17:32.419 "block_size": 512, 00:17:32.419 "num_blocks": 65536, 00:17:32.419 "uuid": "8cec04be-476d-45f1-9001-57355c8119c5", 00:17:32.419 "assigned_rate_limits": { 00:17:32.419 "rw_ios_per_sec": 0, 00:17:32.419 "rw_mbytes_per_sec": 0, 00:17:32.419 "r_mbytes_per_sec": 0, 00:17:32.419 "w_mbytes_per_sec": 0 00:17:32.419 }, 00:17:32.419 "claimed": true, 00:17:32.419 "claim_type": "exclusive_write", 00:17:32.419 "zoned": false, 00:17:32.419 "supported_io_types": { 00:17:32.419 "read": true, 00:17:32.419 "write": true, 00:17:32.419 "unmap": true, 00:17:32.419 "write_zeroes": true, 00:17:32.419 "flush": true, 00:17:32.419 "reset": true, 00:17:32.419 "compare": false, 00:17:32.419 "compare_and_write": false, 00:17:32.419 "abort": true, 00:17:32.419 "nvme_admin": false, 00:17:32.419 "nvme_io": false 00:17:32.419 }, 00:17:32.419 "memory_domains": [ 00:17:32.419 { 00:17:32.419 "dma_device_id": "system", 00:17:32.419 "dma_device_type": 1 00:17:32.419 }, 00:17:32.419 { 00:17:32.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.419 "dma_device_type": 2 00:17:32.419 } 00:17:32.419 ], 00:17:32.419 "driver_specific": {} 00:17:32.419 }' 00:17:32.419 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:32.419 11:53:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:32.419 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:32.419 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:32.419 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:32.419 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.419 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:32.419 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:32.678 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.678 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:32.678 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:32.678 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:32.678 11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:32.937 [2024-05-14 11:53:59.841215] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:32.937 [2024-05-14 11:53:59.841242] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:32.937 [2024-05-14 11:53:59.841298] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:32.937 [2024-05-14 11:53:59.841363] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:32.937 [2024-05-14 11:53:59.841375] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2745920 name Existed_Raid, state offline 00:17:32.937 
11:53:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 1729235 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1729235 ']' 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 1729235 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1729235 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1729235' 00:17:32.937 killing process with pid 1729235 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 1729235 00:17:32.937 [2024-05-14 11:53:59.914319] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:32.937 11:53:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 1729235 00:17:32.937 [2024-05-14 11:53:59.952419] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:33.196 11:54:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:17:33.196 00:17:33.196 real 0m31.096s 00:17:33.196 user 0m56.945s 00:17:33.196 sys 0m5.597s 00:17:33.196 11:54:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:17:33.196 11:54:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:33.196 
************************************ 00:17:33.196 END TEST raid_state_function_test_sb 00:17:33.196 ************************************ 00:17:33.196 11:54:00 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:17:33.196 11:54:00 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:17:33.196 11:54:00 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:17:33.196 11:54:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:33.196 ************************************ 00:17:33.196 START TEST raid_superblock_test 00:17:33.196 ************************************ 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test concat 4 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=concat 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=4 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' concat '!=' raid1 ']' 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=1733976 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 1733976 /var/tmp/spdk-raid.sock 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 1733976 ']' 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:33.196 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:17:33.196 11:54:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.455 [2024-05-14 11:54:00.322670] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:17:33.455 [2024-05-14 11:54:00.322734] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1733976 ] 00:17:33.455 [2024-05-14 11:54:00.454184] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:33.715 [2024-05-14 11:54:00.566248] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:17:33.715 [2024-05-14 11:54:00.628616] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:33.715 [2024-05-14 11:54:00.628642] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:34.283 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:17:34.543 malloc1 00:17:34.543 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:34.802 [2024-05-14 11:54:01.747088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:34.802 [2024-05-14 11:54:01.747136] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:34.802 [2024-05-14 11:54:01.747160] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f092a0 00:17:34.802 [2024-05-14 11:54:01.747172] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:34.802 [2024-05-14 11:54:01.748957] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:34.802 [2024-05-14 11:54:01.748986] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:34.802 pt1 00:17:34.802 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:17:34.802 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:34.802 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:17:34.802 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:17:34.802 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:34.802 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:34.803 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:17:34.803 11:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:34.803 11:54:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:35.061 malloc2 00:17:35.061 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:35.321 [2024-05-14 11:54:02.238500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:35.321 [2024-05-14 11:54:02.238547] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:35.321 [2024-05-14 11:54:02.238570] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20bc480 00:17:35.321 [2024-05-14 11:54:02.238582] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:35.321 [2024-05-14 11:54:02.240184] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:35.321 [2024-05-14 11:54:02.240212] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:35.321 pt2 00:17:35.321 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:17:35.321 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:35.321 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:17:35.321 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:17:35.321 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:35.321 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:35.321 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:17:35.321 11:54:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:35.321 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:35.579 malloc3 00:17:35.579 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:35.838 [2024-05-14 11:54:02.732382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:35.838 [2024-05-14 11:54:02.732436] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:35.838 [2024-05-14 11:54:02.732455] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f02e80 00:17:35.838 [2024-05-14 11:54:02.732468] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:35.838 [2024-05-14 11:54:02.734008] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:35.838 [2024-05-14 11:54:02.734037] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:35.838 pt3 00:17:35.838 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:17:35.838 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:35.838 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc4 00:17:35.838 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt4 00:17:35.838 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:17:35.838 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:35.838 11:54:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:17:35.838 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:35.838 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:17:36.098 malloc4 00:17:36.098 11:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:36.357 [2024-05-14 11:54:03.218302] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:36.357 [2024-05-14 11:54:03.218349] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:36.357 [2024-05-14 11:54:03.218372] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f05490 00:17:36.357 [2024-05-14 11:54:03.218384] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:36.357 [2024-05-14 11:54:03.219971] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:36.357 [2024-05-14 11:54:03.220001] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:36.357 pt4 00:17:36.357 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:17:36.357 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:36.357 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:17:36.617 [2024-05-14 11:54:03.459152] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:17:36.617 [2024-05-14 11:54:03.460528] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:36.617 [2024-05-14 11:54:03.460583] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:36.617 [2024-05-14 11:54:03.460627] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:36.617 [2024-05-14 11:54:03.460815] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f067a0 00:17:36.617 [2024-05-14 11:54:03.460827] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:36.617 [2024-05-14 11:54:03.461031] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f06770 00:17:36.617 [2024-05-14 11:54:03.461185] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f067a0 00:17:36.617 [2024-05-14 11:54:03.461195] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f067a0 00:17:36.617 [2024-05-14 11:54:03.461304] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:36.617 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.876 11:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:36.876 "name": "raid_bdev1", 00:17:36.876 "uuid": "b6174f77-758a-4623-b7f4-1d77a1e8bcf3", 00:17:36.876 "strip_size_kb": 64, 00:17:36.876 "state": "online", 00:17:36.876 "raid_level": "concat", 00:17:36.876 "superblock": true, 00:17:36.876 "num_base_bdevs": 4, 00:17:36.876 "num_base_bdevs_discovered": 4, 00:17:36.876 "num_base_bdevs_operational": 4, 00:17:36.876 "base_bdevs_list": [ 00:17:36.876 { 00:17:36.876 "name": "pt1", 00:17:36.876 "uuid": "83954a3e-dbdb-55d7-ba37-e1c367fa958b", 00:17:36.876 "is_configured": true, 00:17:36.876 "data_offset": 2048, 00:17:36.876 "data_size": 63488 00:17:36.876 }, 00:17:36.876 { 00:17:36.876 "name": "pt2", 00:17:36.876 "uuid": "5aac5034-fa83-546f-ba31-319ff2841547", 00:17:36.876 "is_configured": true, 00:17:36.876 "data_offset": 2048, 00:17:36.876 "data_size": 63488 00:17:36.876 }, 00:17:36.876 { 00:17:36.876 "name": "pt3", 00:17:36.876 "uuid": "ba20ca36-9188-586b-a067-cc53580d004e", 00:17:36.876 "is_configured": true, 00:17:36.876 "data_offset": 2048, 00:17:36.876 "data_size": 63488 00:17:36.876 }, 00:17:36.876 { 00:17:36.876 "name": "pt4", 00:17:36.876 "uuid": "68f19e4d-0d58-51c1-ad84-330a6bf32bfb", 00:17:36.876 "is_configured": true, 00:17:36.876 "data_offset": 2048, 00:17:36.876 "data_size": 63488 00:17:36.876 } 00:17:36.876 ] 00:17:36.876 }' 00:17:36.876 11:54:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:36.876 11:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:37.445 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:17:37.445 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:17:37.445 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:37.445 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:37.445 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:37.445 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:17:37.445 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:37.445 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:37.445 [2024-05-14 11:54:04.470068] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:37.445 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:37.445 "name": "raid_bdev1", 00:17:37.445 "aliases": [ 00:17:37.445 "b6174f77-758a-4623-b7f4-1d77a1e8bcf3" 00:17:37.445 ], 00:17:37.445 "product_name": "Raid Volume", 00:17:37.445 "block_size": 512, 00:17:37.445 "num_blocks": 253952, 00:17:37.445 "uuid": "b6174f77-758a-4623-b7f4-1d77a1e8bcf3", 00:17:37.445 "assigned_rate_limits": { 00:17:37.445 "rw_ios_per_sec": 0, 00:17:37.445 "rw_mbytes_per_sec": 0, 00:17:37.445 "r_mbytes_per_sec": 0, 00:17:37.445 "w_mbytes_per_sec": 0 00:17:37.445 }, 00:17:37.445 "claimed": false, 00:17:37.445 "zoned": false, 00:17:37.445 "supported_io_types": { 00:17:37.445 "read": true, 00:17:37.445 "write": true, 00:17:37.445 
"unmap": true, 00:17:37.445 "write_zeroes": true, 00:17:37.445 "flush": true, 00:17:37.445 "reset": true, 00:17:37.445 "compare": false, 00:17:37.445 "compare_and_write": false, 00:17:37.445 "abort": false, 00:17:37.445 "nvme_admin": false, 00:17:37.445 "nvme_io": false 00:17:37.445 }, 00:17:37.445 "memory_domains": [ 00:17:37.445 { 00:17:37.445 "dma_device_id": "system", 00:17:37.445 "dma_device_type": 1 00:17:37.445 }, 00:17:37.445 { 00:17:37.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.445 "dma_device_type": 2 00:17:37.445 }, 00:17:37.445 { 00:17:37.445 "dma_device_id": "system", 00:17:37.445 "dma_device_type": 1 00:17:37.445 }, 00:17:37.445 { 00:17:37.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.445 "dma_device_type": 2 00:17:37.445 }, 00:17:37.445 { 00:17:37.445 "dma_device_id": "system", 00:17:37.445 "dma_device_type": 1 00:17:37.445 }, 00:17:37.445 { 00:17:37.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.445 "dma_device_type": 2 00:17:37.445 }, 00:17:37.445 { 00:17:37.445 "dma_device_id": "system", 00:17:37.445 "dma_device_type": 1 00:17:37.445 }, 00:17:37.445 { 00:17:37.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.445 "dma_device_type": 2 00:17:37.445 } 00:17:37.445 ], 00:17:37.445 "driver_specific": { 00:17:37.445 "raid": { 00:17:37.445 "uuid": "b6174f77-758a-4623-b7f4-1d77a1e8bcf3", 00:17:37.445 "strip_size_kb": 64, 00:17:37.445 "state": "online", 00:17:37.445 "raid_level": "concat", 00:17:37.445 "superblock": true, 00:17:37.445 "num_base_bdevs": 4, 00:17:37.445 "num_base_bdevs_discovered": 4, 00:17:37.445 "num_base_bdevs_operational": 4, 00:17:37.445 "base_bdevs_list": [ 00:17:37.445 { 00:17:37.445 "name": "pt1", 00:17:37.445 "uuid": "83954a3e-dbdb-55d7-ba37-e1c367fa958b", 00:17:37.445 "is_configured": true, 00:17:37.445 "data_offset": 2048, 00:17:37.445 "data_size": 63488 00:17:37.445 }, 00:17:37.445 { 00:17:37.445 "name": "pt2", 00:17:37.445 "uuid": "5aac5034-fa83-546f-ba31-319ff2841547", 00:17:37.445 
"is_configured": true, 00:17:37.445 "data_offset": 2048, 00:17:37.445 "data_size": 63488 00:17:37.445 }, 00:17:37.445 { 00:17:37.445 "name": "pt3", 00:17:37.445 "uuid": "ba20ca36-9188-586b-a067-cc53580d004e", 00:17:37.446 "is_configured": true, 00:17:37.446 "data_offset": 2048, 00:17:37.446 "data_size": 63488 00:17:37.446 }, 00:17:37.446 { 00:17:37.446 "name": "pt4", 00:17:37.446 "uuid": "68f19e4d-0d58-51c1-ad84-330a6bf32bfb", 00:17:37.446 "is_configured": true, 00:17:37.446 "data_offset": 2048, 00:17:37.446 "data_size": 63488 00:17:37.446 } 00:17:37.446 ] 00:17:37.446 } 00:17:37.446 } 00:17:37.446 }' 00:17:37.446 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:37.705 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:17:37.705 pt2 00:17:37.705 pt3 00:17:37.705 pt4' 00:17:37.705 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:37.705 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:37.705 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:37.705 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:37.705 "name": "pt1", 00:17:37.705 "aliases": [ 00:17:37.705 "83954a3e-dbdb-55d7-ba37-e1c367fa958b" 00:17:37.705 ], 00:17:37.705 "product_name": "passthru", 00:17:37.705 "block_size": 512, 00:17:37.705 "num_blocks": 65536, 00:17:37.705 "uuid": "83954a3e-dbdb-55d7-ba37-e1c367fa958b", 00:17:37.705 "assigned_rate_limits": { 00:17:37.705 "rw_ios_per_sec": 0, 00:17:37.705 "rw_mbytes_per_sec": 0, 00:17:37.705 "r_mbytes_per_sec": 0, 00:17:37.705 "w_mbytes_per_sec": 0 00:17:37.705 }, 00:17:37.705 "claimed": true, 00:17:37.705 "claim_type": "exclusive_write", 
00:17:37.705 "zoned": false, 00:17:37.705 "supported_io_types": { 00:17:37.705 "read": true, 00:17:37.705 "write": true, 00:17:37.705 "unmap": true, 00:17:37.705 "write_zeroes": true, 00:17:37.705 "flush": true, 00:17:37.705 "reset": true, 00:17:37.705 "compare": false, 00:17:37.705 "compare_and_write": false, 00:17:37.705 "abort": true, 00:17:37.705 "nvme_admin": false, 00:17:37.705 "nvme_io": false 00:17:37.705 }, 00:17:37.705 "memory_domains": [ 00:17:37.705 { 00:17:37.705 "dma_device_id": "system", 00:17:37.705 "dma_device_type": 1 00:17:37.705 }, 00:17:37.705 { 00:17:37.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.705 "dma_device_type": 2 00:17:37.705 } 00:17:37.705 ], 00:17:37.705 "driver_specific": { 00:17:37.705 "passthru": { 00:17:37.705 "name": "pt1", 00:17:37.705 "base_bdev_name": "malloc1" 00:17:37.705 } 00:17:37.705 } 00:17:37.705 }' 00:17:37.705 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:37.965 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:37.965 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:37.965 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:37.965 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:37.965 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:37.965 11:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:37.965 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:37.965 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:37.965 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:38.224 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:38.224 11:54:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:38.224 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:38.224 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:38.224 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:38.483 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:38.483 "name": "pt2", 00:17:38.483 "aliases": [ 00:17:38.483 "5aac5034-fa83-546f-ba31-319ff2841547" 00:17:38.483 ], 00:17:38.483 "product_name": "passthru", 00:17:38.483 "block_size": 512, 00:17:38.483 "num_blocks": 65536, 00:17:38.483 "uuid": "5aac5034-fa83-546f-ba31-319ff2841547", 00:17:38.483 "assigned_rate_limits": { 00:17:38.483 "rw_ios_per_sec": 0, 00:17:38.483 "rw_mbytes_per_sec": 0, 00:17:38.483 "r_mbytes_per_sec": 0, 00:17:38.483 "w_mbytes_per_sec": 0 00:17:38.483 }, 00:17:38.483 "claimed": true, 00:17:38.483 "claim_type": "exclusive_write", 00:17:38.483 "zoned": false, 00:17:38.483 "supported_io_types": { 00:17:38.483 "read": true, 00:17:38.483 "write": true, 00:17:38.483 "unmap": true, 00:17:38.483 "write_zeroes": true, 00:17:38.483 "flush": true, 00:17:38.483 "reset": true, 00:17:38.483 "compare": false, 00:17:38.483 "compare_and_write": false, 00:17:38.483 "abort": true, 00:17:38.483 "nvme_admin": false, 00:17:38.483 "nvme_io": false 00:17:38.483 }, 00:17:38.483 "memory_domains": [ 00:17:38.483 { 00:17:38.483 "dma_device_id": "system", 00:17:38.483 "dma_device_type": 1 00:17:38.483 }, 00:17:38.483 { 00:17:38.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.483 "dma_device_type": 2 00:17:38.483 } 00:17:38.483 ], 00:17:38.483 "driver_specific": { 00:17:38.483 "passthru": { 00:17:38.483 "name": "pt2", 00:17:38.483 "base_bdev_name": "malloc2" 00:17:38.483 } 00:17:38.483 } 
00:17:38.483 }' 00:17:38.483 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:38.483 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:38.483 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:38.483 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:38.483 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:38.483 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:38.483 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:38.743 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:38.743 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:38.743 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:38.743 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:38.743 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:38.743 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:38.743 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:38.743 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:39.002 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:39.002 "name": "pt3", 00:17:39.002 "aliases": [ 00:17:39.002 "ba20ca36-9188-586b-a067-cc53580d004e" 00:17:39.002 ], 00:17:39.002 "product_name": "passthru", 00:17:39.002 "block_size": 512, 00:17:39.002 "num_blocks": 65536, 00:17:39.002 "uuid": "ba20ca36-9188-586b-a067-cc53580d004e", 
00:17:39.002 "assigned_rate_limits": { 00:17:39.002 "rw_ios_per_sec": 0, 00:17:39.002 "rw_mbytes_per_sec": 0, 00:17:39.002 "r_mbytes_per_sec": 0, 00:17:39.002 "w_mbytes_per_sec": 0 00:17:39.002 }, 00:17:39.002 "claimed": true, 00:17:39.002 "claim_type": "exclusive_write", 00:17:39.002 "zoned": false, 00:17:39.002 "supported_io_types": { 00:17:39.002 "read": true, 00:17:39.002 "write": true, 00:17:39.002 "unmap": true, 00:17:39.002 "write_zeroes": true, 00:17:39.002 "flush": true, 00:17:39.002 "reset": true, 00:17:39.002 "compare": false, 00:17:39.002 "compare_and_write": false, 00:17:39.002 "abort": true, 00:17:39.002 "nvme_admin": false, 00:17:39.002 "nvme_io": false 00:17:39.002 }, 00:17:39.002 "memory_domains": [ 00:17:39.002 { 00:17:39.002 "dma_device_id": "system", 00:17:39.002 "dma_device_type": 1 00:17:39.002 }, 00:17:39.002 { 00:17:39.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.002 "dma_device_type": 2 00:17:39.002 } 00:17:39.002 ], 00:17:39.002 "driver_specific": { 00:17:39.002 "passthru": { 00:17:39.002 "name": "pt3", 00:17:39.002 "base_bdev_name": "malloc3" 00:17:39.002 } 00:17:39.002 } 00:17:39.002 }' 00:17:39.002 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:39.002 11:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:39.002 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:39.002 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:39.002 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:39.261 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:39.520 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:39.520 "name": "pt4", 00:17:39.520 "aliases": [ 00:17:39.520 "68f19e4d-0d58-51c1-ad84-330a6bf32bfb" 00:17:39.520 ], 00:17:39.520 "product_name": "passthru", 00:17:39.520 "block_size": 512, 00:17:39.520 "num_blocks": 65536, 00:17:39.520 "uuid": "68f19e4d-0d58-51c1-ad84-330a6bf32bfb", 00:17:39.520 "assigned_rate_limits": { 00:17:39.520 "rw_ios_per_sec": 0, 00:17:39.520 "rw_mbytes_per_sec": 0, 00:17:39.520 "r_mbytes_per_sec": 0, 00:17:39.520 "w_mbytes_per_sec": 0 00:17:39.520 }, 00:17:39.520 "claimed": true, 00:17:39.520 "claim_type": "exclusive_write", 00:17:39.520 "zoned": false, 00:17:39.520 "supported_io_types": { 00:17:39.520 "read": true, 00:17:39.520 "write": true, 00:17:39.520 "unmap": true, 00:17:39.520 "write_zeroes": true, 00:17:39.520 "flush": true, 00:17:39.520 "reset": true, 00:17:39.520 "compare": false, 00:17:39.520 "compare_and_write": false, 00:17:39.520 "abort": true, 00:17:39.520 "nvme_admin": false, 00:17:39.520 "nvme_io": false 00:17:39.520 }, 00:17:39.520 "memory_domains": [ 00:17:39.520 { 00:17:39.520 "dma_device_id": "system", 00:17:39.520 "dma_device_type": 1 00:17:39.520 }, 00:17:39.520 { 00:17:39.520 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.520 "dma_device_type": 2 00:17:39.520 } 00:17:39.520 ], 00:17:39.520 "driver_specific": { 00:17:39.520 "passthru": { 00:17:39.520 "name": "pt4", 00:17:39.520 "base_bdev_name": "malloc4" 00:17:39.520 } 00:17:39.520 } 00:17:39.520 }' 00:17:39.520 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:39.520 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:39.779 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:40.038 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:40.038 11:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:17:40.038 [2024-05-14 11:54:07.089026] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:40.038 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # 
raid_bdev_uuid=b6174f77-758a-4623-b7f4-1d77a1e8bcf3 00:17:40.038 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z b6174f77-758a-4623-b7f4-1d77a1e8bcf3 ']' 00:17:40.038 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:40.298 [2024-05-14 11:54:07.333415] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:40.298 [2024-05-14 11:54:07.333440] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:40.298 [2024-05-14 11:54:07.333493] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:40.298 [2024-05-14 11:54:07.333566] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:40.298 [2024-05-14 11:54:07.333579] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f067a0 name raid_bdev1, state offline 00:17:40.298 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:17:40.298 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.623 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:17:40.623 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:17:40.623 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:17:40.624 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:40.882 11:54:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:17:40.882 11:54:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:41.141 11:54:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:17:41.141 11:54:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:41.400 11:54:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:17:41.400 11:54:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:41.659 11:54:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:41.659 11:54:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:41.918 11:54:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:41.918 [2024-05-14 11:54:08.997748] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:41.918 [2024-05-14 11:54:08.999147] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:41.918 [2024-05-14 11:54:08.999191] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:41.918 [2024-05-14 11:54:08.999223] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:17:41.918 [2024-05-14 11:54:08.999273] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 
00:17:41.918 [2024-05-14 11:54:08.999313] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2
00:17:41.918 [2024-05-14 11:54:08.999337] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3
00:17:41.918 [2024-05-14 11:54:08.999360] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4
00:17:41.918 [2024-05-14 11:54:08.999379] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:17:41.918 [2024-05-14 11:54:08.999389] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f05160 name raid_bdev1, state configuring
00:17:41.918 request:
00:17:41.918 {
00:17:41.918 "name": "raid_bdev1",
00:17:41.918 "raid_level": "concat",
00:17:41.918 "base_bdevs": [
00:17:41.918 "malloc1",
00:17:41.918 "malloc2",
00:17:41.918 "malloc3",
00:17:41.918 "malloc4"
00:17:41.918 ],
00:17:41.918 "superblock": false,
00:17:41.918 "strip_size_kb": 64,
00:17:41.918 "method": "bdev_raid_create",
00:17:41.918 "req_id": 1
00:17:41.918 }
00:17:41.918 Got JSON-RPC error response
00:17:41.918 response:
00:17:41.918 {
00:17:41.918 "code": -17,
00:17:41.918 "message": "Failed to create RAID bdev raid_bdev1: File exists"
00:17:41.918 }
00:17:42.178 11:54:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1
00:17:42.178 11:54:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:17:42.178 11:54:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:17:42.178 11:54:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:17:42.178 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:42.178 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]'
00:17:42.178 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev=
00:17:42.178 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']'
00:17:42.178 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:17:42.437 [2024-05-14 11:54:09.398750] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:17:42.437 [2024-05-14 11:54:09.398797] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:42.437 [2024-05-14 11:54:09.398817] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b2040
00:17:42.437 [2024-05-14 11:54:09.398830] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:42.437 [2024-05-14 11:54:09.400504] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:42.437 [2024-05-14 11:54:09.400531] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:17:42.437 [2024-05-14 11:54:09.400608] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1
00:17:42.437 [2024-05-14 11:54:09.400635] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:17:42.437 pt1
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:42.437 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:42.697 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{
00:17:42.697 "name": "raid_bdev1",
00:17:42.697 "uuid": "b6174f77-758a-4623-b7f4-1d77a1e8bcf3",
00:17:42.697 "strip_size_kb": 64,
00:17:42.697 "state": "configuring",
00:17:42.697 "raid_level": "concat",
00:17:42.697 "superblock": true,
00:17:42.697 "num_base_bdevs": 4,
00:17:42.697 "num_base_bdevs_discovered": 1,
00:17:42.697 "num_base_bdevs_operational": 4,
00:17:42.698 "base_bdevs_list": [
00:17:42.698 {
00:17:42.698 "name": "pt1",
00:17:42.698 "uuid": "83954a3e-dbdb-55d7-ba37-e1c367fa958b",
00:17:42.698 "is_configured": true,
00:17:42.698 "data_offset": 2048,
00:17:42.698 "data_size": 63488
00:17:42.698 },
00:17:42.698 {
00:17:42.698 "name": null,
00:17:42.698 "uuid": "5aac5034-fa83-546f-ba31-319ff2841547",
00:17:42.698 "is_configured": false,
00:17:42.698 "data_offset": 2048,
00:17:42.698 "data_size": 63488
00:17:42.698 },
00:17:42.698 {
00:17:42.698 "name": null,
00:17:42.698 "uuid": "ba20ca36-9188-586b-a067-cc53580d004e",
00:17:42.698 "is_configured": false,
00:17:42.698 "data_offset": 2048,
00:17:42.698 "data_size": 63488
00:17:42.698 },
00:17:42.698 {
00:17:42.698 "name": null,
00:17:42.698 "uuid": "68f19e4d-0d58-51c1-ad84-330a6bf32bfb",
00:17:42.698 "is_configured": false,
00:17:42.698 "data_offset": 2048,
00:17:42.698 "data_size": 63488
00:17:42.698 }
00:17:42.698 ]
00:17:42.698 }'
00:17:42.698 11:54:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable
00:17:42.698 11:54:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:17:43.265 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 4 -gt 2 ']'
00:17:43.265 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:17:43.524 [2024-05-14 11:54:10.369339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:17:43.524 [2024-05-14 11:54:10.369392] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:43.524 [2024-05-14 11:54:10.369420] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f06cf0
00:17:43.524 [2024-05-14 11:54:10.369433] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:43.524 [2024-05-14 11:54:10.369774] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:43.524 [2024-05-14 11:54:10.369791] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:17:43.524 [2024-05-14 11:54:10.369855] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2
00:17:43.524 [2024-05-14 11:54:10.369876] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:17:43.524 pt2
00:17:43.524 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:17:43.524 [2024-05-14 11:54:10.609983] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{
00:17:43.783 "name": "raid_bdev1",
00:17:43.783 "uuid": "b6174f77-758a-4623-b7f4-1d77a1e8bcf3",
00:17:43.783 "strip_size_kb": 64,
00:17:43.783 "state": "configuring",
00:17:43.783 "raid_level": "concat",
00:17:43.783 "superblock": true,
00:17:43.783 "num_base_bdevs": 4,
00:17:43.783 "num_base_bdevs_discovered": 1,
00:17:43.783 "num_base_bdevs_operational": 4,
00:17:43.783 "base_bdevs_list": [
00:17:43.783 {
00:17:43.783 "name": "pt1",
00:17:43.783 "uuid": "83954a3e-dbdb-55d7-ba37-e1c367fa958b",
00:17:43.783 "is_configured": true,
00:17:43.783 "data_offset": 2048,
00:17:43.783 "data_size": 63488
00:17:43.783 },
00:17:43.783 {
00:17:43.783 "name": null,
00:17:43.783 "uuid": "5aac5034-fa83-546f-ba31-319ff2841547",
00:17:43.783 "is_configured": false,
00:17:43.783 "data_offset": 2048,
00:17:43.783 "data_size": 63488
00:17:43.783 },
00:17:43.783 {
00:17:43.783 "name": null,
00:17:43.783 "uuid": "ba20ca36-9188-586b-a067-cc53580d004e",
00:17:43.783 "is_configured": false,
00:17:43.783 "data_offset": 2048,
00:17:43.783 "data_size": 63488
00:17:43.783 },
00:17:43.783 {
00:17:43.783 "name": null,
00:17:43.783 "uuid": "68f19e4d-0d58-51c1-ad84-330a6bf32bfb",
00:17:43.783 "is_configured": false,
00:17:43.783 "data_offset": 2048,
00:17:43.783 "data_size": 63488
00:17:43.783 }
00:17:43.783 ]
00:17:43.783 }'
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable
00:17:43.783 11:54:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:17:44.720 11:54:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 ))
00:17:44.720 11:54:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs ))
00:17:44.720 11:54:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:17:44.720 [2024-05-14 11:54:11.676797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:17:44.720 [2024-05-14 11:54:11.676849] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:44.720 [2024-05-14 11:54:11.676868] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f07020
00:17:44.720 [2024-05-14 11:54:11.676880] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:44.720 [2024-05-14 11:54:11.677216] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:44.720 [2024-05-14 11:54:11.677232] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:17:44.720 [2024-05-14 11:54:11.677296] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2
00:17:44.720 [2024-05-14 11:54:11.677316] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:17:44.720 pt2
00:17:44.720 11:54:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ ))
00:17:44.720 11:54:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs ))
00:17:44.720 11:54:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003
00:17:44.979 [2024-05-14 11:54:11.921449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3
00:17:44.979 [2024-05-14 11:54:11.921488] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:44.979 [2024-05-14 11:54:11.921506] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f056c0
00:17:44.979 [2024-05-14 11:54:11.921518] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:44.979 [2024-05-14 11:54:11.921828] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:44.979 [2024-05-14 11:54:11.921845] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3
00:17:44.979 [2024-05-14 11:54:11.921902] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3
00:17:44.979 [2024-05-14 11:54:11.921921] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed
00:17:44.979 pt3
00:17:44.979 11:54:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ ))
00:17:44.979 11:54:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs ))
00:17:44.979 11:54:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004
00:17:45.239 [2024-05-14 11:54:12.097914] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4
00:17:45.239 [2024-05-14 11:54:12.097945] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:45.239 [2024-05-14 11:54:12.097964] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f073f0
00:17:45.239 [2024-05-14 11:54:12.097976] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:45.239 [2024-05-14 11:54:12.098266] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:45.239 [2024-05-14 11:54:12.098282] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4
00:17:45.239 [2024-05-14 11:54:12.098334] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4
00:17:45.239 [2024-05-14 11:54:12.098353] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed
00:17:45.239 [2024-05-14 11:54:12.098480] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f06310
00:17:45.239 [2024-05-14 11:54:12.098492] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512
00:17:45.239 [2024-05-14 11:54:12.098664] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f05050
00:17:45.239 [2024-05-14 11:54:12.098795] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f06310
00:17:45.239 [2024-05-14 11:54:12.098804] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f06310
00:17:45.239 [2024-05-14 11:54:12.098898] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:17:45.239 pt4
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ ))
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs ))
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:45.239 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:45.498 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{
00:17:45.498 "name": "raid_bdev1",
00:17:45.498 "uuid": "b6174f77-758a-4623-b7f4-1d77a1e8bcf3",
00:17:45.498 "strip_size_kb": 64,
00:17:45.498 "state": "online",
00:17:45.498 "raid_level": "concat",
00:17:45.498 "superblock": true,
00:17:45.498 "num_base_bdevs": 4,
00:17:45.498 "num_base_bdevs_discovered": 4,
00:17:45.498 "num_base_bdevs_operational": 4,
00:17:45.498 "base_bdevs_list": [
00:17:45.498 {
00:17:45.498 "name": "pt1",
00:17:45.498 "uuid": "83954a3e-dbdb-55d7-ba37-e1c367fa958b",
00:17:45.498 "is_configured": true,
00:17:45.498 "data_offset": 2048,
00:17:45.498 "data_size": 63488
00:17:45.498 },
00:17:45.498 {
00:17:45.498 "name": "pt2",
00:17:45.498 "uuid": "5aac5034-fa83-546f-ba31-319ff2841547",
00:17:45.498 "is_configured": true,
00:17:45.498 "data_offset": 2048,
00:17:45.498 "data_size": 63488
00:17:45.498 },
00:17:45.498 {
00:17:45.498 "name": "pt3",
00:17:45.498 "uuid": "ba20ca36-9188-586b-a067-cc53580d004e",
00:17:45.498 "is_configured": true,
00:17:45.498 "data_offset": 2048,
00:17:45.498 "data_size": 63488
00:17:45.498 },
00:17:45.498 {
00:17:45.498 "name": "pt4",
00:17:45.498 "uuid": "68f19e4d-0d58-51c1-ad84-330a6bf32bfb",
00:17:45.498 "is_configured": true,
00:17:45.498 "data_offset": 2048,
00:17:45.498 "data_size": 63488
00:17:45.498 }
00:17:45.498 ]
00:17:45.498 }'
00:17:45.498 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable
00:17:45.498 11:54:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:17:46.065 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1
00:17:46.065 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1
00:17:46.065 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info
00:17:46.065 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info
00:17:46.065 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names
00:17:46.065 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name
00:17:46.065 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:17:46.065 11:54:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]'
00:17:46.065 [2024-05-14 11:54:13.092897] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:17:46.065 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{
00:17:46.065 "name": "raid_bdev1",
00:17:46.065 "aliases": [
00:17:46.065 "b6174f77-758a-4623-b7f4-1d77a1e8bcf3"
00:17:46.065 ],
00:17:46.065 "product_name": "Raid Volume",
00:17:46.065 "block_size": 512,
00:17:46.065 "num_blocks": 253952,
00:17:46.065 "uuid": "b6174f77-758a-4623-b7f4-1d77a1e8bcf3",
00:17:46.065 "assigned_rate_limits": {
00:17:46.065 "rw_ios_per_sec": 0,
00:17:46.065 "rw_mbytes_per_sec": 0,
00:17:46.065 "r_mbytes_per_sec": 0,
00:17:46.065 "w_mbytes_per_sec": 0
00:17:46.065 },
00:17:46.065 "claimed": false,
00:17:46.065 "zoned": false,
00:17:46.065 "supported_io_types": {
00:17:46.065 "read": true,
00:17:46.065 "write": true,
00:17:46.065 "unmap": true,
00:17:46.065 "write_zeroes": true,
00:17:46.065 "flush": true,
00:17:46.065 "reset": true,
00:17:46.065 "compare": false,
00:17:46.065 "compare_and_write": false,
00:17:46.065 "abort": false,
00:17:46.065 "nvme_admin": false,
00:17:46.065 "nvme_io": false
00:17:46.065 },
00:17:46.065 "memory_domains": [
00:17:46.065 {
00:17:46.065 "dma_device_id": "system",
00:17:46.065 "dma_device_type": 1
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:46.065 "dma_device_type": 2
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "dma_device_id": "system",
00:17:46.065 "dma_device_type": 1
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:46.065 "dma_device_type": 2
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "dma_device_id": "system",
00:17:46.065 "dma_device_type": 1
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:46.065 "dma_device_type": 2
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "dma_device_id": "system",
00:17:46.065 "dma_device_type": 1
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:46.065 "dma_device_type": 2
00:17:46.065 }
00:17:46.065 ],
00:17:46.065 "driver_specific": {
00:17:46.065 "raid": {
00:17:46.065 "uuid": "b6174f77-758a-4623-b7f4-1d77a1e8bcf3",
00:17:46.065 "strip_size_kb": 64,
00:17:46.065 "state": "online",
00:17:46.065 "raid_level": "concat",
00:17:46.065 "superblock": true,
00:17:46.065 "num_base_bdevs": 4,
00:17:46.065 "num_base_bdevs_discovered": 4,
00:17:46.065 "num_base_bdevs_operational": 4,
00:17:46.065 "base_bdevs_list": [
00:17:46.065 {
00:17:46.065 "name": "pt1",
00:17:46.065 "uuid": "83954a3e-dbdb-55d7-ba37-e1c367fa958b",
00:17:46.065 "is_configured": true,
00:17:46.065 "data_offset": 2048,
00:17:46.065 "data_size": 63488
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "name": "pt2",
00:17:46.065 "uuid": "5aac5034-fa83-546f-ba31-319ff2841547",
00:17:46.065 "is_configured": true,
00:17:46.065 "data_offset": 2048,
00:17:46.065 "data_size": 63488
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "name": "pt3",
00:17:46.065 "uuid": "ba20ca36-9188-586b-a067-cc53580d004e",
00:17:46.065 "is_configured": true,
00:17:46.065 "data_offset": 2048,
00:17:46.065 "data_size": 63488
00:17:46.065 },
00:17:46.065 {
00:17:46.065 "name": "pt4",
00:17:46.065 "uuid": "68f19e4d-0d58-51c1-ad84-330a6bf32bfb",
00:17:46.065 "is_configured": true,
00:17:46.065 "data_offset": 2048,
00:17:46.065 "data_size": 63488
00:17:46.065 }
00:17:46.065 ]
00:17:46.065 }
00:17:46.065 }
00:17:46.065 }'
00:17:46.065 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:17:46.065 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1
00:17:46.065 pt2
00:17:46.065 pt3
00:17:46.065 pt4'
00:17:46.065 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names
00:17:46.065 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1
00:17:46.065 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]'
00:17:46.324 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{
00:17:46.324 "name": "pt1",
00:17:46.324 "aliases": [
00:17:46.324 "83954a3e-dbdb-55d7-ba37-e1c367fa958b"
00:17:46.324 ],
00:17:46.324 "product_name": "passthru",
00:17:46.324 "block_size": 512,
00:17:46.324 "num_blocks": 65536,
00:17:46.324 "uuid": "83954a3e-dbdb-55d7-ba37-e1c367fa958b",
00:17:46.324 "assigned_rate_limits": {
00:17:46.324 "rw_ios_per_sec": 0,
00:17:46.324 "rw_mbytes_per_sec": 0,
00:17:46.324 "r_mbytes_per_sec": 0,
00:17:46.324 "w_mbytes_per_sec": 0
00:17:46.324 },
00:17:46.324 "claimed": true,
00:17:46.324 "claim_type": "exclusive_write",
00:17:46.324 "zoned": false,
00:17:46.324 "supported_io_types": {
00:17:46.324 "read": true,
00:17:46.324 "write": true,
00:17:46.324 "unmap": true,
00:17:46.324 "write_zeroes": true,
00:17:46.324 "flush": true,
00:17:46.324 "reset": true,
00:17:46.324 "compare": false,
00:17:46.324 "compare_and_write": false,
00:17:46.324 "abort": true,
00:17:46.324 "nvme_admin": false,
00:17:46.324 "nvme_io": false
00:17:46.324 },
00:17:46.324 "memory_domains": [
00:17:46.324 {
00:17:46.324 "dma_device_id": "system",
00:17:46.324 "dma_device_type": 1
00:17:46.324 },
00:17:46.324 {
00:17:46.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:46.324 "dma_device_type": 2
00:17:46.324 }
00:17:46.324 ],
00:17:46.324 "driver_specific": {
00:17:46.324 "passthru": {
00:17:46.324 "name": "pt1",
00:17:46.324 "base_bdev_name": "malloc1"
00:17:46.324 }
00:17:46.324 }
00:17:46.324 }'
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]]
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:17:46.583 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:17:46.843 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]]
00:17:46.843 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names
00:17:46.843 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2
00:17:46.843 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]'
00:17:46.843 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{
00:17:46.843 "name": "pt2",
00:17:46.843 "aliases": [
00:17:46.843 "5aac5034-fa83-546f-ba31-319ff2841547"
00:17:46.843 ],
00:17:46.843 "product_name": "passthru",
00:17:46.843 "block_size": 512,
00:17:46.843 "num_blocks": 65536,
00:17:46.843 "uuid": "5aac5034-fa83-546f-ba31-319ff2841547",
00:17:46.843 "assigned_rate_limits": {
00:17:46.843 "rw_ios_per_sec": 0,
00:17:46.843 "rw_mbytes_per_sec": 0,
00:17:46.843 "r_mbytes_per_sec": 0,
00:17:46.843 "w_mbytes_per_sec": 0
00:17:46.843 },
00:17:46.843 "claimed": true,
00:17:46.843 "claim_type": "exclusive_write",
00:17:46.843 "zoned": false,
00:17:46.843 "supported_io_types": {
00:17:46.843 "read": true,
00:17:46.843 "write": true,
00:17:46.843 "unmap": true,
00:17:46.843 "write_zeroes": true,
00:17:46.843 "flush": true,
00:17:46.843 "reset": true,
00:17:46.843 "compare": false,
00:17:46.843 "compare_and_write": false,
00:17:46.843 "abort": true,
00:17:46.843 "nvme_admin": false,
00:17:46.843 "nvme_io": false
00:17:46.843 },
00:17:46.843 "memory_domains": [
00:17:46.843 {
00:17:46.843 "dma_device_id": "system",
00:17:46.843 "dma_device_type": 1
00:17:46.843 },
00:17:46.843 {
00:17:46.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:46.843 "dma_device_type": 2
00:17:46.843 }
00:17:46.843 ],
00:17:46.843 "driver_specific": {
00:17:46.843 "passthru": {
00:17:46.843 "name": "pt2",
00:17:46.843 "base_bdev_name": "malloc2"
00:17:46.843 }
00:17:46.843 }
00:17:46.843 }'
00:17:47.102 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:17:47.102 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:17:47.102 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]]
00:17:47.102 11:54:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:17:47.102 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:17:47.102 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:17:47.102 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:17:47.102 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:17:47.102 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:17:47.102 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:17:47.361 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:17:47.361 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]]
00:17:47.361 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names
00:17:47.361 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3
00:17:47.361 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]'
00:17:47.620 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{
00:17:47.620 "name": "pt3",
00:17:47.620 "aliases": [
00:17:47.620 "ba20ca36-9188-586b-a067-cc53580d004e"
00:17:47.620 ],
00:17:47.620 "product_name": "passthru",
00:17:47.620 "block_size": 512,
00:17:47.620 "num_blocks": 65536,
00:17:47.620 "uuid": "ba20ca36-9188-586b-a067-cc53580d004e",
00:17:47.620 "assigned_rate_limits": {
00:17:47.620 "rw_ios_per_sec": 0,
00:17:47.620 "rw_mbytes_per_sec": 0,
00:17:47.620 "r_mbytes_per_sec": 0,
00:17:47.620 "w_mbytes_per_sec": 0
00:17:47.620 },
00:17:47.620 "claimed": true,
00:17:47.620 "claim_type": "exclusive_write",
00:17:47.620 "zoned": false,
00:17:47.620 "supported_io_types": {
00:17:47.620 "read": true,
00:17:47.620 "write": true,
00:17:47.620 "unmap": true,
00:17:47.620 "write_zeroes": true,
00:17:47.620 "flush": true,
00:17:47.620 "reset": true,
00:17:47.620 "compare": false,
00:17:47.620 "compare_and_write": false,
00:17:47.620 "abort": true,
00:17:47.620 "nvme_admin": false,
00:17:47.620 "nvme_io": false
00:17:47.620 },
00:17:47.620 "memory_domains": [
00:17:47.620 {
00:17:47.620 "dma_device_id": "system",
00:17:47.620 "dma_device_type": 1
00:17:47.620 },
00:17:47.620 {
00:17:47.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:47.620 "dma_device_type": 2
00:17:47.620 }
00:17:47.620 ],
00:17:47.620 "driver_specific": {
00:17:47.620 "passthru": {
00:17:47.620 "name": "pt3",
00:17:47.620 "base_bdev_name": "malloc3"
00:17:47.620 }
00:17:47.620 }
00:17:47.620 }'
00:17:47.620 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:17:47.620 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:17:47.620 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]]
00:17:47.620 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:17:47.620 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:17:47.620 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:17:47.620 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:17:47.879 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:17:47.879 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:17:47.879 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:17:47.879 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:17:47.879 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]]
00:17:47.879 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names
00:17:47.879 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4
00:17:47.879 11:54:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]'
00:17:48.139 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{
00:17:48.139 "name": "pt4",
00:17:48.139 "aliases": [
00:17:48.139 "68f19e4d-0d58-51c1-ad84-330a6bf32bfb"
00:17:48.139 ],
00:17:48.139 "product_name": "passthru",
00:17:48.139 "block_size": 512,
00:17:48.139 "num_blocks": 65536,
00:17:48.139 "uuid": "68f19e4d-0d58-51c1-ad84-330a6bf32bfb",
00:17:48.139 "assigned_rate_limits": {
00:17:48.139 "rw_ios_per_sec": 0,
00:17:48.139 "rw_mbytes_per_sec": 0,
00:17:48.139 "r_mbytes_per_sec": 0,
00:17:48.139 "w_mbytes_per_sec": 0
00:17:48.139 },
00:17:48.139 "claimed": true,
00:17:48.139 "claim_type": "exclusive_write",
00:17:48.139 "zoned": false,
00:17:48.139 "supported_io_types": {
00:17:48.139 "read": true,
00:17:48.139 "write": true,
00:17:48.139 "unmap": true,
00:17:48.139 "write_zeroes": true,
00:17:48.139 "flush": true,
00:17:48.139 "reset": true,
00:17:48.139 "compare": false,
00:17:48.139 "compare_and_write": false,
00:17:48.139 "abort": true,
00:17:48.139 "nvme_admin": false,
00:17:48.139 "nvme_io": false
00:17:48.139 },
00:17:48.139 "memory_domains": [
00:17:48.139 {
00:17:48.139 "dma_device_id": "system",
00:17:48.139 "dma_device_type": 1
00:17:48.139 },
00:17:48.139 {
00:17:48.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:48.139 "dma_device_type": 2
00:17:48.139 }
00:17:48.139 ],
00:17:48.139 "driver_specific": {
00:17:48.139 "passthru": {
00:17:48.139 "name": "pt4",
00:17:48.139 "base_bdev_name": "malloc4"
00:17:48.139 }
00:17:48.139 }
00:17:48.139 }'
00:17:48.139 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:17:48.139 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size
00:17:48.139 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]]
00:17:48.139 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:17:48.139 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size
00:17:48.139 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:17:48.139 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:17:48.398 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave
00:17:48.398 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:17:48.398 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:17:48.398 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type
00:17:48.398 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]]
00:17:48.398 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:17:48.398 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid'
00:17:48.657 [2024-05-14 11:54:15.547431] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' b6174f77-758a-4623-b7f4-1d77a1e8bcf3 '!=' b6174f77-758a-4623-b7f4-1d77a1e8bcf3 ']'
00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy concat
00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in
00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1
00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 1733976
00:17:48.657 11:54:15 bdev_raid.raid_superblock_test --
common/autotest_common.sh@946 -- # '[' -z 1733976 ']' 00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 1733976 00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1733976 00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:17:48.657 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1733976' 00:17:48.657 killing process with pid 1733976 00:17:48.658 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 1733976 00:17:48.658 [2024-05-14 11:54:15.619976] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:48.658 [2024-05-14 11:54:15.620044] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:48.658 [2024-05-14 11:54:15.620109] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:48.658 [2024-05-14 11:54:15.620123] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f06310 name raid_bdev1, state offline 00:17:48.658 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 1733976 00:17:48.658 [2024-05-14 11:54:15.658205] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:48.918 11:54:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:17:48.918 00:17:48.918 real 0m15.616s 00:17:48.918 user 0m28.108s 00:17:48.918 sys 0m2.801s 00:17:48.918 11:54:15 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1122 -- # xtrace_disable 00:17:48.918 11:54:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:48.918 ************************************ 00:17:48.918 END TEST raid_superblock_test 00:17:48.918 ************************************ 00:17:48.918 11:54:15 bdev_raid -- bdev/bdev_raid.sh@814 -- # for level in raid0 concat raid1 00:17:48.918 11:54:15 bdev_raid -- bdev/bdev_raid.sh@815 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:17:48.918 11:54:15 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:17:48.918 11:54:15 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:17:48.918 11:54:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:48.918 ************************************ 00:17:48.918 START TEST raid_state_function_test 00:17:48.918 ************************************ 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 4 false 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= 
num_base_bdevs )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:17:48.918 
11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=1736880 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1736880' 00:17:48.918 Process raid pid: 1736880 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 1736880 /var/tmp/spdk-raid.sock 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 1736880 ']' 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:48.918 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:17:48.918 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.177 [2024-05-14 11:54:16.041993] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:17:49.177 [2024-05-14 11:54:16.042064] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:49.177 [2024-05-14 11:54:16.172567] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:49.436 [2024-05-14 11:54:16.271516] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:17:49.436 [2024-05-14 11:54:16.334366] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:49.436 [2024-05-14 11:54:16.334410] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:50.005 11:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:17:50.005 11:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:17:50.005 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:50.005 [2024-05-14 11:54:17.080650] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:50.005 [2024-05-14 11:54:17.080693] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:50.005 [2024-05-14 11:54:17.080704] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:50.005 [2024-05-14 11:54:17.080716] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:50.005 [2024-05-14 11:54:17.080725] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:50.005 [2024-05-14 11:54:17.080736] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:50.005 [2024-05-14 11:54:17.080745] 
bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:50.005 [2024-05-14 11:54:17.080757] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:50.264 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:50.264 "name": "Existed_Raid", 00:17:50.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.264 "strip_size_kb": 0, 00:17:50.264 "state": "configuring", 00:17:50.264 
"raid_level": "raid1", 00:17:50.264 "superblock": false, 00:17:50.264 "num_base_bdevs": 4, 00:17:50.264 "num_base_bdevs_discovered": 0, 00:17:50.264 "num_base_bdevs_operational": 4, 00:17:50.264 "base_bdevs_list": [ 00:17:50.264 { 00:17:50.264 "name": "BaseBdev1", 00:17:50.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.264 "is_configured": false, 00:17:50.264 "data_offset": 0, 00:17:50.264 "data_size": 0 00:17:50.265 }, 00:17:50.265 { 00:17:50.265 "name": "BaseBdev2", 00:17:50.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.265 "is_configured": false, 00:17:50.265 "data_offset": 0, 00:17:50.265 "data_size": 0 00:17:50.265 }, 00:17:50.265 { 00:17:50.265 "name": "BaseBdev3", 00:17:50.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.265 "is_configured": false, 00:17:50.265 "data_offset": 0, 00:17:50.265 "data_size": 0 00:17:50.265 }, 00:17:50.265 { 00:17:50.265 "name": "BaseBdev4", 00:17:50.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.265 "is_configured": false, 00:17:50.265 "data_offset": 0, 00:17:50.265 "data_size": 0 00:17:50.265 } 00:17:50.265 ] 00:17:50.265 }' 00:17:50.265 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:50.265 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.833 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:51.092 [2024-05-14 11:54:18.063136] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:51.092 [2024-05-14 11:54:18.063171] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a68720 name Existed_Raid, state configuring 00:17:51.092 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:51.351 [2024-05-14 11:54:18.303779] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:51.351 [2024-05-14 11:54:18.303810] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:51.351 [2024-05-14 11:54:18.303820] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:51.351 [2024-05-14 11:54:18.303831] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:51.351 [2024-05-14 11:54:18.303840] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:51.351 [2024-05-14 11:54:18.303851] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:51.351 [2024-05-14 11:54:18.303860] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:51.351 [2024-05-14 11:54:18.303871] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:51.351 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:51.610 [2024-05-14 11:54:18.559595] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:51.610 BaseBdev1 00:17:51.610 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:17:51.610 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:17:51.610 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:51.610 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:51.610 11:54:18 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:51.610 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:51.610 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:51.870 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:52.129 [ 00:17:52.129 { 00:17:52.129 "name": "BaseBdev1", 00:17:52.129 "aliases": [ 00:17:52.129 "6a0d17e7-d335-4623-b0ce-3c060215bd91" 00:17:52.129 ], 00:17:52.129 "product_name": "Malloc disk", 00:17:52.129 "block_size": 512, 00:17:52.129 "num_blocks": 65536, 00:17:52.129 "uuid": "6a0d17e7-d335-4623-b0ce-3c060215bd91", 00:17:52.129 "assigned_rate_limits": { 00:17:52.129 "rw_ios_per_sec": 0, 00:17:52.129 "rw_mbytes_per_sec": 0, 00:17:52.129 "r_mbytes_per_sec": 0, 00:17:52.130 "w_mbytes_per_sec": 0 00:17:52.130 }, 00:17:52.130 "claimed": true, 00:17:52.130 "claim_type": "exclusive_write", 00:17:52.130 "zoned": false, 00:17:52.130 "supported_io_types": { 00:17:52.130 "read": true, 00:17:52.130 "write": true, 00:17:52.130 "unmap": true, 00:17:52.130 "write_zeroes": true, 00:17:52.130 "flush": true, 00:17:52.130 "reset": true, 00:17:52.130 "compare": false, 00:17:52.130 "compare_and_write": false, 00:17:52.130 "abort": true, 00:17:52.130 "nvme_admin": false, 00:17:52.130 "nvme_io": false 00:17:52.130 }, 00:17:52.130 "memory_domains": [ 00:17:52.130 { 00:17:52.130 "dma_device_id": "system", 00:17:52.130 "dma_device_type": 1 00:17:52.130 }, 00:17:52.130 { 00:17:52.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.130 "dma_device_type": 2 00:17:52.130 } 00:17:52.130 ], 00:17:52.130 "driver_specific": {} 00:17:52.130 } 00:17:52.130 ] 00:17:52.130 
11:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.130 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.389 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:52.389 "name": "Existed_Raid", 00:17:52.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.389 "strip_size_kb": 0, 00:17:52.389 "state": "configuring", 00:17:52.389 "raid_level": "raid1", 00:17:52.389 "superblock": false, 00:17:52.389 "num_base_bdevs": 4, 00:17:52.389 "num_base_bdevs_discovered": 
1, 00:17:52.389 "num_base_bdevs_operational": 4, 00:17:52.389 "base_bdevs_list": [ 00:17:52.389 { 00:17:52.389 "name": "BaseBdev1", 00:17:52.389 "uuid": "6a0d17e7-d335-4623-b0ce-3c060215bd91", 00:17:52.389 "is_configured": true, 00:17:52.389 "data_offset": 0, 00:17:52.389 "data_size": 65536 00:17:52.389 }, 00:17:52.389 { 00:17:52.389 "name": "BaseBdev2", 00:17:52.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.389 "is_configured": false, 00:17:52.389 "data_offset": 0, 00:17:52.389 "data_size": 0 00:17:52.389 }, 00:17:52.389 { 00:17:52.389 "name": "BaseBdev3", 00:17:52.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.389 "is_configured": false, 00:17:52.389 "data_offset": 0, 00:17:52.389 "data_size": 0 00:17:52.389 }, 00:17:52.389 { 00:17:52.389 "name": "BaseBdev4", 00:17:52.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.389 "is_configured": false, 00:17:52.389 "data_offset": 0, 00:17:52.389 "data_size": 0 00:17:52.389 } 00:17:52.389 ] 00:17:52.389 }' 00:17:52.389 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:52.390 11:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.958 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:52.958 [2024-05-14 11:54:19.971311] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:52.958 [2024-05-14 11:54:19.971351] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a67fb0 name Existed_Raid, state configuring 00:17:52.958 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:53.217 [2024-05-14 11:54:20.155840] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:53.217 [2024-05-14 11:54:20.157353] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:53.217 [2024-05-14 11:54:20.157387] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:53.217 [2024-05-14 11:54:20.157407] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:53.217 [2024-05-14 11:54:20.157420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:53.217 [2024-05-14 11:54:20.157429] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:53.217 [2024-05-14 11:54:20.157440] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.217 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.476 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:53.476 "name": "Existed_Raid", 00:17:53.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.476 "strip_size_kb": 0, 00:17:53.476 "state": "configuring", 00:17:53.476 "raid_level": "raid1", 00:17:53.476 "superblock": false, 00:17:53.476 "num_base_bdevs": 4, 00:17:53.476 "num_base_bdevs_discovered": 1, 00:17:53.476 "num_base_bdevs_operational": 4, 00:17:53.476 "base_bdevs_list": [ 00:17:53.476 { 00:17:53.476 "name": "BaseBdev1", 00:17:53.476 "uuid": "6a0d17e7-d335-4623-b0ce-3c060215bd91", 00:17:53.476 "is_configured": true, 00:17:53.476 "data_offset": 0, 00:17:53.476 "data_size": 65536 00:17:53.476 }, 00:17:53.476 { 00:17:53.476 "name": "BaseBdev2", 00:17:53.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.476 "is_configured": false, 00:17:53.476 "data_offset": 0, 00:17:53.476 "data_size": 0 00:17:53.476 }, 00:17:53.476 { 00:17:53.476 "name": "BaseBdev3", 00:17:53.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.476 "is_configured": false, 00:17:53.476 "data_offset": 0, 00:17:53.476 "data_size": 0 00:17:53.476 }, 00:17:53.476 { 00:17:53.476 "name": "BaseBdev4", 00:17:53.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.476 "is_configured": false, 00:17:53.476 "data_offset": 0, 00:17:53.476 "data_size": 0 00:17:53.476 } 00:17:53.476 ] 00:17:53.476 }' 
00:17:53.476 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:53.476 11:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.043 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:54.301 [2024-05-14 11:54:21.177989] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:54.301 BaseBdev2 00:17:54.301 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:17:54.301 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:17:54.301 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:54.301 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:54.301 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:54.301 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:54.301 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:54.301 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:54.593 [ 00:17:54.593 { 00:17:54.593 "name": "BaseBdev2", 00:17:54.593 "aliases": [ 00:17:54.593 "8ef88377-d171-496e-a15a-20f87738c456" 00:17:54.593 ], 00:17:54.593 "product_name": "Malloc disk", 00:17:54.593 "block_size": 512, 00:17:54.593 "num_blocks": 65536, 00:17:54.593 "uuid": "8ef88377-d171-496e-a15a-20f87738c456", 
00:17:54.593 "assigned_rate_limits": { 00:17:54.593 "rw_ios_per_sec": 0, 00:17:54.593 "rw_mbytes_per_sec": 0, 00:17:54.593 "r_mbytes_per_sec": 0, 00:17:54.593 "w_mbytes_per_sec": 0 00:17:54.593 }, 00:17:54.593 "claimed": true, 00:17:54.593 "claim_type": "exclusive_write", 00:17:54.593 "zoned": false, 00:17:54.593 "supported_io_types": { 00:17:54.593 "read": true, 00:17:54.593 "write": true, 00:17:54.593 "unmap": true, 00:17:54.593 "write_zeroes": true, 00:17:54.593 "flush": true, 00:17:54.593 "reset": true, 00:17:54.593 "compare": false, 00:17:54.593 "compare_and_write": false, 00:17:54.593 "abort": true, 00:17:54.593 "nvme_admin": false, 00:17:54.593 "nvme_io": false 00:17:54.593 }, 00:17:54.593 "memory_domains": [ 00:17:54.593 { 00:17:54.593 "dma_device_id": "system", 00:17:54.593 "dma_device_type": 1 00:17:54.593 }, 00:17:54.593 { 00:17:54.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.593 "dma_device_type": 2 00:17:54.593 } 00:17:54.593 ], 00:17:54.593 "driver_specific": {} 00:17:54.593 } 00:17:54.593 ] 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:54.593 11:54:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.593 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.852 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:54.852 "name": "Existed_Raid", 00:17:54.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.852 "strip_size_kb": 0, 00:17:54.852 "state": "configuring", 00:17:54.852 "raid_level": "raid1", 00:17:54.852 "superblock": false, 00:17:54.852 "num_base_bdevs": 4, 00:17:54.852 "num_base_bdevs_discovered": 2, 00:17:54.852 "num_base_bdevs_operational": 4, 00:17:54.852 "base_bdevs_list": [ 00:17:54.852 { 00:17:54.852 "name": "BaseBdev1", 00:17:54.852 "uuid": "6a0d17e7-d335-4623-b0ce-3c060215bd91", 00:17:54.852 "is_configured": true, 00:17:54.852 "data_offset": 0, 00:17:54.852 "data_size": 65536 00:17:54.852 }, 00:17:54.852 { 00:17:54.852 "name": "BaseBdev2", 00:17:54.852 "uuid": "8ef88377-d171-496e-a15a-20f87738c456", 00:17:54.852 "is_configured": true, 00:17:54.852 "data_offset": 0, 00:17:54.852 "data_size": 65536 00:17:54.852 }, 00:17:54.852 { 00:17:54.852 "name": "BaseBdev3", 00:17:54.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.852 "is_configured": false, 00:17:54.852 "data_offset": 0, 
00:17:54.852 "data_size": 0 00:17:54.852 }, 00:17:54.852 { 00:17:54.852 "name": "BaseBdev4", 00:17:54.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.852 "is_configured": false, 00:17:54.852 "data_offset": 0, 00:17:54.852 "data_size": 0 00:17:54.852 } 00:17:54.852 ] 00:17:54.852 }' 00:17:54.852 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:54.852 11:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.419 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:55.677 [2024-05-14 11:54:22.516906] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:55.677 BaseBdev3 00:17:55.677 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:17:55.677 11:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:17:55.677 11:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:55.677 11:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:55.677 11:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:55.677 11:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:55.677 11:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:55.935 11:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:55.935 [ 00:17:55.935 { 
00:17:55.935 "name": "BaseBdev3", 00:17:55.935 "aliases": [ 00:17:55.935 "8747ad2e-6e01-4b68-8cc4-09cfb4c5f311" 00:17:55.935 ], 00:17:55.935 "product_name": "Malloc disk", 00:17:55.935 "block_size": 512, 00:17:55.935 "num_blocks": 65536, 00:17:55.935 "uuid": "8747ad2e-6e01-4b68-8cc4-09cfb4c5f311", 00:17:55.935 "assigned_rate_limits": { 00:17:55.935 "rw_ios_per_sec": 0, 00:17:55.935 "rw_mbytes_per_sec": 0, 00:17:55.935 "r_mbytes_per_sec": 0, 00:17:55.935 "w_mbytes_per_sec": 0 00:17:55.935 }, 00:17:55.935 "claimed": true, 00:17:55.935 "claim_type": "exclusive_write", 00:17:55.935 "zoned": false, 00:17:55.935 "supported_io_types": { 00:17:55.935 "read": true, 00:17:55.935 "write": true, 00:17:55.935 "unmap": true, 00:17:55.935 "write_zeroes": true, 00:17:55.935 "flush": true, 00:17:55.935 "reset": true, 00:17:55.935 "compare": false, 00:17:55.935 "compare_and_write": false, 00:17:55.935 "abort": true, 00:17:55.935 "nvme_admin": false, 00:17:55.935 "nvme_io": false 00:17:55.935 }, 00:17:55.935 "memory_domains": [ 00:17:55.935 { 00:17:55.935 "dma_device_id": "system", 00:17:55.935 "dma_device_type": 1 00:17:55.935 }, 00:17:55.935 { 00:17:55.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:55.935 "dma_device_type": 2 00:17:55.935 } 00:17:55.935 ], 00:17:55.935 "driver_specific": {} 00:17:55.935 } 00:17:55.935 ] 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # 
local expected_state=configuring 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:56.194 "name": "Existed_Raid", 00:17:56.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.194 "strip_size_kb": 0, 00:17:56.194 "state": "configuring", 00:17:56.194 "raid_level": "raid1", 00:17:56.194 "superblock": false, 00:17:56.194 "num_base_bdevs": 4, 00:17:56.194 "num_base_bdevs_discovered": 3, 00:17:56.194 "num_base_bdevs_operational": 4, 00:17:56.194 "base_bdevs_list": [ 00:17:56.194 { 00:17:56.194 "name": "BaseBdev1", 00:17:56.194 "uuid": "6a0d17e7-d335-4623-b0ce-3c060215bd91", 00:17:56.194 "is_configured": true, 00:17:56.194 "data_offset": 0, 00:17:56.194 "data_size": 65536 00:17:56.194 }, 00:17:56.194 { 00:17:56.194 "name": "BaseBdev2", 00:17:56.194 "uuid": "8ef88377-d171-496e-a15a-20f87738c456", 00:17:56.194 
"is_configured": true, 00:17:56.194 "data_offset": 0, 00:17:56.194 "data_size": 65536 00:17:56.194 }, 00:17:56.194 { 00:17:56.194 "name": "BaseBdev3", 00:17:56.194 "uuid": "8747ad2e-6e01-4b68-8cc4-09cfb4c5f311", 00:17:56.194 "is_configured": true, 00:17:56.194 "data_offset": 0, 00:17:56.194 "data_size": 65536 00:17:56.194 }, 00:17:56.194 { 00:17:56.194 "name": "BaseBdev4", 00:17:56.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.194 "is_configured": false, 00:17:56.194 "data_offset": 0, 00:17:56.194 "data_size": 0 00:17:56.194 } 00:17:56.194 ] 00:17:56.194 }' 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:56.194 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.128 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:57.128 [2024-05-14 11:54:24.084515] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:57.128 [2024-05-14 11:54:24.084556] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a691b0 00:17:57.128 [2024-05-14 11:54:24.084564] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:57.128 [2024-05-14 11:54:24.084771] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a6a860 00:17:57.128 [2024-05-14 11:54:24.084908] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a691b0 00:17:57.128 [2024-05-14 11:54:24.084918] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a691b0 00:17:57.128 [2024-05-14 11:54:24.085093] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:57.128 BaseBdev4 00:17:57.128 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # 
waitforbdev BaseBdev4 00:17:57.128 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:17:57.128 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:57.128 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:17:57.128 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:57.128 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:57.128 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:57.387 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:57.646 [ 00:17:57.646 { 00:17:57.646 "name": "BaseBdev4", 00:17:57.646 "aliases": [ 00:17:57.646 "a00b0c7f-d4ce-4f4e-a723-7080f21fe62d" 00:17:57.646 ], 00:17:57.646 "product_name": "Malloc disk", 00:17:57.646 "block_size": 512, 00:17:57.646 "num_blocks": 65536, 00:17:57.646 "uuid": "a00b0c7f-d4ce-4f4e-a723-7080f21fe62d", 00:17:57.646 "assigned_rate_limits": { 00:17:57.646 "rw_ios_per_sec": 0, 00:17:57.646 "rw_mbytes_per_sec": 0, 00:17:57.646 "r_mbytes_per_sec": 0, 00:17:57.646 "w_mbytes_per_sec": 0 00:17:57.646 }, 00:17:57.646 "claimed": true, 00:17:57.646 "claim_type": "exclusive_write", 00:17:57.646 "zoned": false, 00:17:57.646 "supported_io_types": { 00:17:57.646 "read": true, 00:17:57.646 "write": true, 00:17:57.646 "unmap": true, 00:17:57.646 "write_zeroes": true, 00:17:57.646 "flush": true, 00:17:57.646 "reset": true, 00:17:57.646 "compare": false, 00:17:57.646 "compare_and_write": false, 00:17:57.646 "abort": true, 00:17:57.646 "nvme_admin": false, 00:17:57.646 
"nvme_io": false 00:17:57.646 }, 00:17:57.646 "memory_domains": [ 00:17:57.646 { 00:17:57.646 "dma_device_id": "system", 00:17:57.646 "dma_device_type": 1 00:17:57.646 }, 00:17:57.646 { 00:17:57.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.646 "dma_device_type": 2 00:17:57.646 } 00:17:57.646 ], 00:17:57.646 "driver_specific": {} 00:17:57.646 } 00:17:57.646 ] 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.646 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.905 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:57.905 "name": "Existed_Raid", 00:17:57.905 "uuid": "ad90e62b-1aff-49ce-9ec0-14219b20abfb", 00:17:57.905 "strip_size_kb": 0, 00:17:57.905 "state": "online", 00:17:57.905 "raid_level": "raid1", 00:17:57.905 "superblock": false, 00:17:57.905 "num_base_bdevs": 4, 00:17:57.905 "num_base_bdevs_discovered": 4, 00:17:57.905 "num_base_bdevs_operational": 4, 00:17:57.905 "base_bdevs_list": [ 00:17:57.905 { 00:17:57.905 "name": "BaseBdev1", 00:17:57.905 "uuid": "6a0d17e7-d335-4623-b0ce-3c060215bd91", 00:17:57.905 "is_configured": true, 00:17:57.905 "data_offset": 0, 00:17:57.905 "data_size": 65536 00:17:57.905 }, 00:17:57.905 { 00:17:57.905 "name": "BaseBdev2", 00:17:57.905 "uuid": "8ef88377-d171-496e-a15a-20f87738c456", 00:17:57.905 "is_configured": true, 00:17:57.905 "data_offset": 0, 00:17:57.905 "data_size": 65536 00:17:57.905 }, 00:17:57.905 { 00:17:57.905 "name": "BaseBdev3", 00:17:57.905 "uuid": "8747ad2e-6e01-4b68-8cc4-09cfb4c5f311", 00:17:57.905 "is_configured": true, 00:17:57.905 "data_offset": 0, 00:17:57.905 "data_size": 65536 00:17:57.905 }, 00:17:57.905 { 00:17:57.905 "name": "BaseBdev4", 00:17:57.905 "uuid": "a00b0c7f-d4ce-4f4e-a723-7080f21fe62d", 00:17:57.905 "is_configured": true, 00:17:57.905 "data_offset": 0, 00:17:57.905 "data_size": 65536 00:17:57.905 } 00:17:57.905 ] 00:17:57.905 }' 00:17:57.905 11:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:57.905 11:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.472 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:17:58.472 11:54:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:17:58.472 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:58.472 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:58.472 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:58.472 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:17:58.472 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:58.472 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:58.731 [2024-05-14 11:54:25.660973] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:58.731 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:58.731 "name": "Existed_Raid", 00:17:58.731 "aliases": [ 00:17:58.731 "ad90e62b-1aff-49ce-9ec0-14219b20abfb" 00:17:58.731 ], 00:17:58.731 "product_name": "Raid Volume", 00:17:58.731 "block_size": 512, 00:17:58.731 "num_blocks": 65536, 00:17:58.731 "uuid": "ad90e62b-1aff-49ce-9ec0-14219b20abfb", 00:17:58.731 "assigned_rate_limits": { 00:17:58.731 "rw_ios_per_sec": 0, 00:17:58.731 "rw_mbytes_per_sec": 0, 00:17:58.731 "r_mbytes_per_sec": 0, 00:17:58.731 "w_mbytes_per_sec": 0 00:17:58.731 }, 00:17:58.731 "claimed": false, 00:17:58.731 "zoned": false, 00:17:58.731 "supported_io_types": { 00:17:58.731 "read": true, 00:17:58.731 "write": true, 00:17:58.731 "unmap": false, 00:17:58.731 "write_zeroes": true, 00:17:58.731 "flush": false, 00:17:58.731 "reset": true, 00:17:58.731 "compare": false, 00:17:58.731 "compare_and_write": false, 00:17:58.731 "abort": false, 00:17:58.731 "nvme_admin": false, 00:17:58.731 "nvme_io": false 00:17:58.731 }, 00:17:58.731 "memory_domains": [ 
00:17:58.731 { 00:17:58.731 "dma_device_id": "system", 00:17:58.731 "dma_device_type": 1 00:17:58.731 }, 00:17:58.731 { 00:17:58.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.731 "dma_device_type": 2 00:17:58.731 }, 00:17:58.731 { 00:17:58.731 "dma_device_id": "system", 00:17:58.731 "dma_device_type": 1 00:17:58.731 }, 00:17:58.731 { 00:17:58.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.731 "dma_device_type": 2 00:17:58.731 }, 00:17:58.731 { 00:17:58.731 "dma_device_id": "system", 00:17:58.731 "dma_device_type": 1 00:17:58.731 }, 00:17:58.731 { 00:17:58.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.731 "dma_device_type": 2 00:17:58.731 }, 00:17:58.731 { 00:17:58.731 "dma_device_id": "system", 00:17:58.731 "dma_device_type": 1 00:17:58.731 }, 00:17:58.731 { 00:17:58.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.731 "dma_device_type": 2 00:17:58.731 } 00:17:58.731 ], 00:17:58.731 "driver_specific": { 00:17:58.731 "raid": { 00:17:58.731 "uuid": "ad90e62b-1aff-49ce-9ec0-14219b20abfb", 00:17:58.731 "strip_size_kb": 0, 00:17:58.731 "state": "online", 00:17:58.731 "raid_level": "raid1", 00:17:58.731 "superblock": false, 00:17:58.731 "num_base_bdevs": 4, 00:17:58.731 "num_base_bdevs_discovered": 4, 00:17:58.731 "num_base_bdevs_operational": 4, 00:17:58.731 "base_bdevs_list": [ 00:17:58.731 { 00:17:58.731 "name": "BaseBdev1", 00:17:58.731 "uuid": "6a0d17e7-d335-4623-b0ce-3c060215bd91", 00:17:58.731 "is_configured": true, 00:17:58.731 "data_offset": 0, 00:17:58.731 "data_size": 65536 00:17:58.731 }, 00:17:58.731 { 00:17:58.731 "name": "BaseBdev2", 00:17:58.731 "uuid": "8ef88377-d171-496e-a15a-20f87738c456", 00:17:58.731 "is_configured": true, 00:17:58.731 "data_offset": 0, 00:17:58.731 "data_size": 65536 00:17:58.731 }, 00:17:58.731 { 00:17:58.731 "name": "BaseBdev3", 00:17:58.731 "uuid": "8747ad2e-6e01-4b68-8cc4-09cfb4c5f311", 00:17:58.731 "is_configured": true, 00:17:58.731 "data_offset": 0, 00:17:58.731 "data_size": 65536 00:17:58.731 
}, 00:17:58.731 { 00:17:58.731 "name": "BaseBdev4", 00:17:58.731 "uuid": "a00b0c7f-d4ce-4f4e-a723-7080f21fe62d", 00:17:58.731 "is_configured": true, 00:17:58.731 "data_offset": 0, 00:17:58.731 "data_size": 65536 00:17:58.731 } 00:17:58.731 ] 00:17:58.731 } 00:17:58.731 } 00:17:58.731 }' 00:17:58.731 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:58.731 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:17:58.731 BaseBdev2 00:17:58.731 BaseBdev3 00:17:58.731 BaseBdev4' 00:17:58.731 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:58.731 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:58.731 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:58.989 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:58.989 "name": "BaseBdev1", 00:17:58.989 "aliases": [ 00:17:58.989 "6a0d17e7-d335-4623-b0ce-3c060215bd91" 00:17:58.989 ], 00:17:58.989 "product_name": "Malloc disk", 00:17:58.989 "block_size": 512, 00:17:58.989 "num_blocks": 65536, 00:17:58.989 "uuid": "6a0d17e7-d335-4623-b0ce-3c060215bd91", 00:17:58.989 "assigned_rate_limits": { 00:17:58.989 "rw_ios_per_sec": 0, 00:17:58.989 "rw_mbytes_per_sec": 0, 00:17:58.989 "r_mbytes_per_sec": 0, 00:17:58.989 "w_mbytes_per_sec": 0 00:17:58.990 }, 00:17:58.990 "claimed": true, 00:17:58.990 "claim_type": "exclusive_write", 00:17:58.990 "zoned": false, 00:17:58.990 "supported_io_types": { 00:17:58.990 "read": true, 00:17:58.990 "write": true, 00:17:58.990 "unmap": true, 00:17:58.990 "write_zeroes": true, 00:17:58.990 "flush": true, 00:17:58.990 "reset": true, 00:17:58.990 
"compare": false, 00:17:58.990 "compare_and_write": false, 00:17:58.990 "abort": true, 00:17:58.990 "nvme_admin": false, 00:17:58.990 "nvme_io": false 00:17:58.990 }, 00:17:58.990 "memory_domains": [ 00:17:58.990 { 00:17:58.990 "dma_device_id": "system", 00:17:58.990 "dma_device_type": 1 00:17:58.990 }, 00:17:58.990 { 00:17:58.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.990 "dma_device_type": 2 00:17:58.990 } 00:17:58.990 ], 00:17:58.990 "driver_specific": {} 00:17:58.990 }' 00:17:58.990 11:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:58.990 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:58.990 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:58.990 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:59.248 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:59.507 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:59.507 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:59.507 "name": "BaseBdev2", 00:17:59.507 "aliases": [ 00:17:59.507 "8ef88377-d171-496e-a15a-20f87738c456" 00:17:59.507 ], 00:17:59.507 "product_name": "Malloc disk", 00:17:59.507 "block_size": 512, 00:17:59.507 "num_blocks": 65536, 00:17:59.507 "uuid": "8ef88377-d171-496e-a15a-20f87738c456", 00:17:59.507 "assigned_rate_limits": { 00:17:59.507 "rw_ios_per_sec": 0, 00:17:59.507 "rw_mbytes_per_sec": 0, 00:17:59.507 "r_mbytes_per_sec": 0, 00:17:59.507 "w_mbytes_per_sec": 0 00:17:59.507 }, 00:17:59.507 "claimed": true, 00:17:59.507 "claim_type": "exclusive_write", 00:17:59.507 "zoned": false, 00:17:59.507 "supported_io_types": { 00:17:59.507 "read": true, 00:17:59.507 "write": true, 00:17:59.507 "unmap": true, 00:17:59.507 "write_zeroes": true, 00:17:59.507 "flush": true, 00:17:59.507 "reset": true, 00:17:59.507 "compare": false, 00:17:59.507 "compare_and_write": false, 00:17:59.507 "abort": true, 00:17:59.507 "nvme_admin": false, 00:17:59.507 "nvme_io": false 00:17:59.507 }, 00:17:59.507 "memory_domains": [ 00:17:59.507 { 00:17:59.507 "dma_device_id": "system", 00:17:59.507 "dma_device_type": 1 00:17:59.507 }, 00:17:59.507 { 00:17:59.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.507 "dma_device_type": 2 00:17:59.507 } 00:17:59.507 ], 00:17:59.507 "driver_specific": {} 00:17:59.507 }' 00:17:59.507 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:59.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:59.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:59.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:59.765 11:54:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:59.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:59.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:59.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:59.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:59.765 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:00.024 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:00.024 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:00.024 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:00.024 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:00.024 11:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:00.282 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:00.282 "name": "BaseBdev3", 00:18:00.282 "aliases": [ 00:18:00.282 "8747ad2e-6e01-4b68-8cc4-09cfb4c5f311" 00:18:00.282 ], 00:18:00.282 "product_name": "Malloc disk", 00:18:00.282 "block_size": 512, 00:18:00.282 "num_blocks": 65536, 00:18:00.282 "uuid": "8747ad2e-6e01-4b68-8cc4-09cfb4c5f311", 00:18:00.282 "assigned_rate_limits": { 00:18:00.282 "rw_ios_per_sec": 0, 00:18:00.282 "rw_mbytes_per_sec": 0, 00:18:00.282 "r_mbytes_per_sec": 0, 00:18:00.282 "w_mbytes_per_sec": 0 00:18:00.282 }, 00:18:00.282 "claimed": true, 00:18:00.282 "claim_type": "exclusive_write", 00:18:00.282 "zoned": false, 00:18:00.282 "supported_io_types": { 00:18:00.282 "read": true, 
00:18:00.282 "write": true, 00:18:00.282 "unmap": true, 00:18:00.282 "write_zeroes": true, 00:18:00.282 "flush": true, 00:18:00.282 "reset": true, 00:18:00.282 "compare": false, 00:18:00.282 "compare_and_write": false, 00:18:00.282 "abort": true, 00:18:00.282 "nvme_admin": false, 00:18:00.282 "nvme_io": false 00:18:00.282 }, 00:18:00.282 "memory_domains": [ 00:18:00.282 { 00:18:00.282 "dma_device_id": "system", 00:18:00.282 "dma_device_type": 1 00:18:00.282 }, 00:18:00.282 { 00:18:00.282 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.282 "dma_device_type": 2 00:18:00.282 } 00:18:00.282 ], 00:18:00.282 "driver_specific": {} 00:18:00.282 }' 00:18:00.282 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:00.282 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:00.282 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:00.282 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:00.282 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:00.282 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:00.282 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:00.542 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:00.542 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:00.542 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:00.542 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:00.542 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:00.542 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in 
$base_bdev_names 00:18:00.542 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:00.542 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:00.801 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:00.801 "name": "BaseBdev4", 00:18:00.801 "aliases": [ 00:18:00.801 "a00b0c7f-d4ce-4f4e-a723-7080f21fe62d" 00:18:00.801 ], 00:18:00.801 "product_name": "Malloc disk", 00:18:00.801 "block_size": 512, 00:18:00.801 "num_blocks": 65536, 00:18:00.801 "uuid": "a00b0c7f-d4ce-4f4e-a723-7080f21fe62d", 00:18:00.801 "assigned_rate_limits": { 00:18:00.801 "rw_ios_per_sec": 0, 00:18:00.801 "rw_mbytes_per_sec": 0, 00:18:00.801 "r_mbytes_per_sec": 0, 00:18:00.801 "w_mbytes_per_sec": 0 00:18:00.801 }, 00:18:00.801 "claimed": true, 00:18:00.801 "claim_type": "exclusive_write", 00:18:00.801 "zoned": false, 00:18:00.801 "supported_io_types": { 00:18:00.801 "read": true, 00:18:00.801 "write": true, 00:18:00.801 "unmap": true, 00:18:00.801 "write_zeroes": true, 00:18:00.801 "flush": true, 00:18:00.801 "reset": true, 00:18:00.801 "compare": false, 00:18:00.801 "compare_and_write": false, 00:18:00.801 "abort": true, 00:18:00.801 "nvme_admin": false, 00:18:00.801 "nvme_io": false 00:18:00.801 }, 00:18:00.801 "memory_domains": [ 00:18:00.801 { 00:18:00.801 "dma_device_id": "system", 00:18:00.801 "dma_device_type": 1 00:18:00.801 }, 00:18:00.801 { 00:18:00.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.801 "dma_device_type": 2 00:18:00.801 } 00:18:00.801 ], 00:18:00.801 "driver_specific": {} 00:18:00.801 }' 00:18:00.801 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:00.801 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:00.801 11:54:27 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:00.801 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:01.060 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:01.060 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:01.060 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:01.060 11:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:01.060 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:01.060 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:01.060 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:01.060 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:01.060 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:01.319 [2024-05-14 11:54:28.343872] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 0 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:01.319 11:54:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.319 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:01.578 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:01.578 "name": "Existed_Raid", 00:18:01.578 "uuid": "ad90e62b-1aff-49ce-9ec0-14219b20abfb", 00:18:01.578 "strip_size_kb": 0, 00:18:01.578 "state": "online", 00:18:01.578 "raid_level": "raid1", 00:18:01.578 "superblock": false, 00:18:01.578 "num_base_bdevs": 4, 00:18:01.578 "num_base_bdevs_discovered": 3, 00:18:01.578 "num_base_bdevs_operational": 3, 00:18:01.578 "base_bdevs_list": [ 00:18:01.578 { 00:18:01.578 "name": null, 00:18:01.578 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:01.578 "is_configured": false, 00:18:01.578 "data_offset": 0, 00:18:01.578 
"data_size": 65536 00:18:01.578 }, 00:18:01.578 { 00:18:01.578 "name": "BaseBdev2", 00:18:01.578 "uuid": "8ef88377-d171-496e-a15a-20f87738c456", 00:18:01.578 "is_configured": true, 00:18:01.578 "data_offset": 0, 00:18:01.578 "data_size": 65536 00:18:01.578 }, 00:18:01.578 { 00:18:01.578 "name": "BaseBdev3", 00:18:01.578 "uuid": "8747ad2e-6e01-4b68-8cc4-09cfb4c5f311", 00:18:01.578 "is_configured": true, 00:18:01.578 "data_offset": 0, 00:18:01.578 "data_size": 65536 00:18:01.578 }, 00:18:01.578 { 00:18:01.578 "name": "BaseBdev4", 00:18:01.578 "uuid": "a00b0c7f-d4ce-4f4e-a723-7080f21fe62d", 00:18:01.578 "is_configured": true, 00:18:01.578 "data_offset": 0, 00:18:01.578 "data_size": 65536 00:18:01.578 } 00:18:01.578 ] 00:18:01.578 }' 00:18:01.578 11:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:01.578 11:54:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.145 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:18:02.145 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:02.145 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.145 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:18:02.404 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:18:02.404 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:02.404 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:02.662 [2024-05-14 11:54:29.676470] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev2 00:18:02.662 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:18:02.662 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:02.662 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.662 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:18:02.922 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:18:02.922 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:02.922 11:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:03.181 [2024-05-14 11:54:30.178272] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:03.181 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:18:03.181 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:03.181 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.181 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:18:03.440 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:18:03.440 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:03.440 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:03.698 [2024-05-14 11:54:30.663621] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:03.698 [2024-05-14 11:54:30.663693] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:03.698 [2024-05-14 11:54:30.674459] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:03.698 [2024-05-14 11:54:30.674526] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:03.698 [2024-05-14 11:54:30.674539] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a691b0 name Existed_Raid, state offline 00:18:03.698 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:18:03.698 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:03.698 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:18:03.698 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.956 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:18:03.956 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:18:03.956 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:18:03.956 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:18:03.956 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:03.956 11:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:04.215 BaseBdev2 
00:18:04.215 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:18:04.215 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:18:04.215 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:04.215 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:04.215 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:04.215 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:04.215 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:04.474 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:04.732 [ 00:18:04.732 { 00:18:04.732 "name": "BaseBdev2", 00:18:04.732 "aliases": [ 00:18:04.732 "368034c6-d115-4862-82a3-46908ff4a887" 00:18:04.732 ], 00:18:04.732 "product_name": "Malloc disk", 00:18:04.732 "block_size": 512, 00:18:04.732 "num_blocks": 65536, 00:18:04.732 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:04.732 "assigned_rate_limits": { 00:18:04.732 "rw_ios_per_sec": 0, 00:18:04.732 "rw_mbytes_per_sec": 0, 00:18:04.732 "r_mbytes_per_sec": 0, 00:18:04.732 "w_mbytes_per_sec": 0 00:18:04.732 }, 00:18:04.732 "claimed": false, 00:18:04.732 "zoned": false, 00:18:04.732 "supported_io_types": { 00:18:04.732 "read": true, 00:18:04.732 "write": true, 00:18:04.732 "unmap": true, 00:18:04.732 "write_zeroes": true, 00:18:04.732 "flush": true, 00:18:04.732 "reset": true, 00:18:04.732 "compare": false, 00:18:04.732 "compare_and_write": false, 00:18:04.732 "abort": true, 
00:18:04.732 "nvme_admin": false, 00:18:04.732 "nvme_io": false 00:18:04.732 }, 00:18:04.732 "memory_domains": [ 00:18:04.732 { 00:18:04.732 "dma_device_id": "system", 00:18:04.732 "dma_device_type": 1 00:18:04.732 }, 00:18:04.732 { 00:18:04.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.732 "dma_device_type": 2 00:18:04.732 } 00:18:04.732 ], 00:18:04.732 "driver_specific": {} 00:18:04.732 } 00:18:04.732 ] 00:18:04.732 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:04.732 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:18:04.732 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:04.732 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:04.991 BaseBdev3 00:18:04.991 11:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:18:04.991 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:18:04.991 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:04.991 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:04.991 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:04.991 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:04.991 11:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:05.250 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:05.507 [ 00:18:05.507 { 00:18:05.507 "name": "BaseBdev3", 00:18:05.507 "aliases": [ 00:18:05.507 "998803f2-118a-42c0-ad90-b1e11b4839e8" 00:18:05.507 ], 00:18:05.507 "product_name": "Malloc disk", 00:18:05.507 "block_size": 512, 00:18:05.507 "num_blocks": 65536, 00:18:05.507 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:05.507 "assigned_rate_limits": { 00:18:05.507 "rw_ios_per_sec": 0, 00:18:05.507 "rw_mbytes_per_sec": 0, 00:18:05.507 "r_mbytes_per_sec": 0, 00:18:05.507 "w_mbytes_per_sec": 0 00:18:05.507 }, 00:18:05.507 "claimed": false, 00:18:05.507 "zoned": false, 00:18:05.507 "supported_io_types": { 00:18:05.507 "read": true, 00:18:05.507 "write": true, 00:18:05.507 "unmap": true, 00:18:05.507 "write_zeroes": true, 00:18:05.507 "flush": true, 00:18:05.507 "reset": true, 00:18:05.507 "compare": false, 00:18:05.507 "compare_and_write": false, 00:18:05.507 "abort": true, 00:18:05.507 "nvme_admin": false, 00:18:05.507 "nvme_io": false 00:18:05.507 }, 00:18:05.507 "memory_domains": [ 00:18:05.507 { 00:18:05.507 "dma_device_id": "system", 00:18:05.507 "dma_device_type": 1 00:18:05.507 }, 00:18:05.507 { 00:18:05.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.507 "dma_device_type": 2 00:18:05.507 } 00:18:05.507 ], 00:18:05.507 "driver_specific": {} 00:18:05.507 } 00:18:05.507 ] 00:18:05.507 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:05.507 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:18:05.507 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:05.507 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:05.766 BaseBdev4 
00:18:05.766 11:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:18:05.766 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:18:05.766 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:05.766 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:05.766 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:05.766 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:05.766 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:06.024 11:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:06.024 [ 00:18:06.024 { 00:18:06.024 "name": "BaseBdev4", 00:18:06.024 "aliases": [ 00:18:06.024 "473bc47d-0c4f-4ca8-80de-2611c471b970" 00:18:06.024 ], 00:18:06.024 "product_name": "Malloc disk", 00:18:06.024 "block_size": 512, 00:18:06.024 "num_blocks": 65536, 00:18:06.024 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:06.024 "assigned_rate_limits": { 00:18:06.024 "rw_ios_per_sec": 0, 00:18:06.024 "rw_mbytes_per_sec": 0, 00:18:06.024 "r_mbytes_per_sec": 0, 00:18:06.024 "w_mbytes_per_sec": 0 00:18:06.024 }, 00:18:06.024 "claimed": false, 00:18:06.024 "zoned": false, 00:18:06.024 "supported_io_types": { 00:18:06.024 "read": true, 00:18:06.024 "write": true, 00:18:06.024 "unmap": true, 00:18:06.024 "write_zeroes": true, 00:18:06.024 "flush": true, 00:18:06.024 "reset": true, 00:18:06.024 "compare": false, 00:18:06.024 "compare_and_write": false, 00:18:06.024 "abort": true, 
00:18:06.024 "nvme_admin": false, 00:18:06.025 "nvme_io": false 00:18:06.025 }, 00:18:06.025 "memory_domains": [ 00:18:06.025 { 00:18:06.025 "dma_device_id": "system", 00:18:06.025 "dma_device_type": 1 00:18:06.025 }, 00:18:06.025 { 00:18:06.025 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.025 "dma_device_type": 2 00:18:06.025 } 00:18:06.025 ], 00:18:06.025 "driver_specific": {} 00:18:06.025 } 00:18:06.025 ] 00:18:06.025 11:54:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:06.025 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:18:06.025 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:06.025 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:06.283 [2024-05-14 11:54:33.327986] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:06.283 [2024-05-14 11:54:33.328036] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:06.283 [2024-05-14 11:54:33.328058] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:06.283 [2024-05-14 11:54:33.329487] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:06.283 [2024-05-14 11:54:33.329534] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
expected_state=configuring 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.283 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:06.541 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:06.541 "name": "Existed_Raid", 00:18:06.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.541 "strip_size_kb": 0, 00:18:06.541 "state": "configuring", 00:18:06.541 "raid_level": "raid1", 00:18:06.541 "superblock": false, 00:18:06.541 "num_base_bdevs": 4, 00:18:06.541 "num_base_bdevs_discovered": 3, 00:18:06.541 "num_base_bdevs_operational": 4, 00:18:06.541 "base_bdevs_list": [ 00:18:06.541 { 00:18:06.541 "name": "BaseBdev1", 00:18:06.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.541 "is_configured": false, 00:18:06.541 "data_offset": 0, 00:18:06.541 "data_size": 0 00:18:06.541 }, 00:18:06.541 { 00:18:06.541 "name": "BaseBdev2", 00:18:06.541 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:06.541 "is_configured": 
true, 00:18:06.541 "data_offset": 0, 00:18:06.541 "data_size": 65536 00:18:06.541 }, 00:18:06.541 { 00:18:06.541 "name": "BaseBdev3", 00:18:06.541 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:06.541 "is_configured": true, 00:18:06.541 "data_offset": 0, 00:18:06.541 "data_size": 65536 00:18:06.541 }, 00:18:06.541 { 00:18:06.541 "name": "BaseBdev4", 00:18:06.541 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:06.541 "is_configured": true, 00:18:06.541 "data_offset": 0, 00:18:06.541 "data_size": 65536 00:18:06.541 } 00:18:06.541 ] 00:18:06.541 }' 00:18:06.541 11:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:06.541 11:54:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.108 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:07.366 [2024-05-14 11:54:34.398795] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.366 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:07.623 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:07.623 "name": "Existed_Raid", 00:18:07.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.623 "strip_size_kb": 0, 00:18:07.623 "state": "configuring", 00:18:07.623 "raid_level": "raid1", 00:18:07.623 "superblock": false, 00:18:07.624 "num_base_bdevs": 4, 00:18:07.624 "num_base_bdevs_discovered": 2, 00:18:07.624 "num_base_bdevs_operational": 4, 00:18:07.624 "base_bdevs_list": [ 00:18:07.624 { 00:18:07.624 "name": "BaseBdev1", 00:18:07.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.624 "is_configured": false, 00:18:07.624 "data_offset": 0, 00:18:07.624 "data_size": 0 00:18:07.624 }, 00:18:07.624 { 00:18:07.624 "name": null, 00:18:07.624 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:07.624 "is_configured": false, 00:18:07.624 "data_offset": 0, 00:18:07.624 "data_size": 65536 00:18:07.624 }, 00:18:07.624 { 00:18:07.624 "name": "BaseBdev3", 00:18:07.624 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:07.624 "is_configured": true, 00:18:07.624 "data_offset": 0, 00:18:07.624 "data_size": 65536 00:18:07.624 }, 00:18:07.624 { 00:18:07.624 "name": "BaseBdev4", 00:18:07.624 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:07.624 "is_configured": true, 00:18:07.624 "data_offset": 0, 00:18:07.624 "data_size": 65536 00:18:07.624 } 
00:18:07.624 ] 00:18:07.624 }' 00:18:07.624 11:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:07.624 11:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:08.190 11:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.190 11:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:08.448 11:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:18:08.448 11:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:08.742 [2024-05-14 11:54:35.645469] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:08.742 BaseBdev1 00:18:08.742 11:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:18:08.742 11:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:18:08.742 11:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:08.742 11:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:08.742 11:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:08.742 11:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:08.742 11:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:09.001 11:54:35 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:09.259 [ 00:18:09.259 { 00:18:09.259 "name": "BaseBdev1", 00:18:09.259 "aliases": [ 00:18:09.259 "64b1e63c-2e2b-47cf-9086-fdf97114bf91" 00:18:09.259 ], 00:18:09.259 "product_name": "Malloc disk", 00:18:09.259 "block_size": 512, 00:18:09.259 "num_blocks": 65536, 00:18:09.259 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:09.259 "assigned_rate_limits": { 00:18:09.259 "rw_ios_per_sec": 0, 00:18:09.259 "rw_mbytes_per_sec": 0, 00:18:09.259 "r_mbytes_per_sec": 0, 00:18:09.259 "w_mbytes_per_sec": 0 00:18:09.259 }, 00:18:09.259 "claimed": true, 00:18:09.259 "claim_type": "exclusive_write", 00:18:09.259 "zoned": false, 00:18:09.259 "supported_io_types": { 00:18:09.259 "read": true, 00:18:09.259 "write": true, 00:18:09.259 "unmap": true, 00:18:09.259 "write_zeroes": true, 00:18:09.259 "flush": true, 00:18:09.259 "reset": true, 00:18:09.259 "compare": false, 00:18:09.259 "compare_and_write": false, 00:18:09.259 "abort": true, 00:18:09.259 "nvme_admin": false, 00:18:09.259 "nvme_io": false 00:18:09.259 }, 00:18:09.259 "memory_domains": [ 00:18:09.259 { 00:18:09.259 "dma_device_id": "system", 00:18:09.259 "dma_device_type": 1 00:18:09.259 }, 00:18:09.259 { 00:18:09.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.259 "dma_device_type": 2 00:18:09.259 } 00:18:09.259 ], 00:18:09.259 "driver_specific": {} 00:18:09.259 } 00:18:09.259 ] 00:18:09.259 11:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
expected_state=configuring 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.260 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:09.518 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:09.518 "name": "Existed_Raid", 00:18:09.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.518 "strip_size_kb": 0, 00:18:09.518 "state": "configuring", 00:18:09.518 "raid_level": "raid1", 00:18:09.518 "superblock": false, 00:18:09.518 "num_base_bdevs": 4, 00:18:09.518 "num_base_bdevs_discovered": 3, 00:18:09.518 "num_base_bdevs_operational": 4, 00:18:09.518 "base_bdevs_list": [ 00:18:09.518 { 00:18:09.518 "name": "BaseBdev1", 00:18:09.518 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:09.518 "is_configured": true, 00:18:09.518 "data_offset": 0, 00:18:09.518 "data_size": 65536 00:18:09.518 }, 00:18:09.518 { 00:18:09.518 "name": null, 00:18:09.518 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:09.518 "is_configured": false, 
00:18:09.518 "data_offset": 0, 00:18:09.518 "data_size": 65536 00:18:09.518 }, 00:18:09.518 { 00:18:09.518 "name": "BaseBdev3", 00:18:09.518 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:09.518 "is_configured": true, 00:18:09.518 "data_offset": 0, 00:18:09.518 "data_size": 65536 00:18:09.518 }, 00:18:09.518 { 00:18:09.518 "name": "BaseBdev4", 00:18:09.518 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:09.518 "is_configured": true, 00:18:09.518 "data_offset": 0, 00:18:09.518 "data_size": 65536 00:18:09.518 } 00:18:09.518 ] 00:18:09.518 }' 00:18:09.518 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:09.518 11:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:10.085 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.085 11:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:10.343 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:18:10.343 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:10.343 [2024-05-14 11:54:37.426231] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local raid_level=raid1 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.602 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.861 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:10.861 "name": "Existed_Raid", 00:18:10.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.861 "strip_size_kb": 0, 00:18:10.861 "state": "configuring", 00:18:10.861 "raid_level": "raid1", 00:18:10.861 "superblock": false, 00:18:10.861 "num_base_bdevs": 4, 00:18:10.861 "num_base_bdevs_discovered": 2, 00:18:10.861 "num_base_bdevs_operational": 4, 00:18:10.861 "base_bdevs_list": [ 00:18:10.861 { 00:18:10.861 "name": "BaseBdev1", 00:18:10.861 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:10.861 "is_configured": true, 00:18:10.861 "data_offset": 0, 00:18:10.861 "data_size": 65536 00:18:10.861 }, 00:18:10.861 { 00:18:10.861 "name": null, 00:18:10.861 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:10.861 "is_configured": false, 00:18:10.861 "data_offset": 0, 00:18:10.861 "data_size": 65536 00:18:10.861 }, 00:18:10.861 { 00:18:10.861 "name": 
null, 00:18:10.861 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:10.861 "is_configured": false, 00:18:10.861 "data_offset": 0, 00:18:10.861 "data_size": 65536 00:18:10.861 }, 00:18:10.861 { 00:18:10.861 "name": "BaseBdev4", 00:18:10.861 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:10.861 "is_configured": true, 00:18:10.861 "data_offset": 0, 00:18:10.861 "data_size": 65536 00:18:10.861 } 00:18:10.861 ] 00:18:10.861 }' 00:18:10.861 11:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:10.861 11:54:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.427 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.427 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:11.686 [2024-05-14 11:54:38.737732] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.686 11:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.944 11:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:11.944 "name": "Existed_Raid", 00:18:11.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.944 "strip_size_kb": 0, 00:18:11.944 "state": "configuring", 00:18:11.944 "raid_level": "raid1", 00:18:11.944 "superblock": false, 00:18:11.944 "num_base_bdevs": 4, 00:18:11.944 "num_base_bdevs_discovered": 3, 00:18:11.944 "num_base_bdevs_operational": 4, 00:18:11.944 "base_bdevs_list": [ 00:18:11.944 { 00:18:11.944 "name": "BaseBdev1", 00:18:11.944 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:11.944 "is_configured": true, 00:18:11.944 "data_offset": 0, 00:18:11.944 "data_size": 65536 00:18:11.944 }, 00:18:11.944 { 00:18:11.944 "name": null, 00:18:11.944 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:11.944 "is_configured": false, 00:18:11.944 "data_offset": 0, 00:18:11.944 "data_size": 65536 00:18:11.944 }, 00:18:11.944 { 00:18:11.944 "name": "BaseBdev3", 00:18:11.944 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 
00:18:11.944 "is_configured": true, 00:18:11.944 "data_offset": 0, 00:18:11.944 "data_size": 65536 00:18:11.944 }, 00:18:11.944 { 00:18:11.944 "name": "BaseBdev4", 00:18:11.944 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:11.944 "is_configured": true, 00:18:11.944 "data_offset": 0, 00:18:11.944 "data_size": 65536 00:18:11.944 } 00:18:11.944 ] 00:18:11.944 }' 00:18:11.944 11:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:11.944 11:54:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.879 11:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.879 11:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:12.879 11:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:18:12.879 11:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:13.145 [2024-05-14 11:54:40.073313] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:13.145 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:13.145 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:13.145 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:13.145 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:13.146 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:13.146 11:54:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:13.146 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:13.146 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:13.146 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:13.146 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:13.146 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.146 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.408 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:13.408 "name": "Existed_Raid", 00:18:13.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.408 "strip_size_kb": 0, 00:18:13.408 "state": "configuring", 00:18:13.408 "raid_level": "raid1", 00:18:13.408 "superblock": false, 00:18:13.408 "num_base_bdevs": 4, 00:18:13.408 "num_base_bdevs_discovered": 2, 00:18:13.408 "num_base_bdevs_operational": 4, 00:18:13.408 "base_bdevs_list": [ 00:18:13.408 { 00:18:13.408 "name": null, 00:18:13.408 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:13.408 "is_configured": false, 00:18:13.408 "data_offset": 0, 00:18:13.408 "data_size": 65536 00:18:13.408 }, 00:18:13.408 { 00:18:13.408 "name": null, 00:18:13.408 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:13.408 "is_configured": false, 00:18:13.408 "data_offset": 0, 00:18:13.408 "data_size": 65536 00:18:13.408 }, 00:18:13.408 { 00:18:13.408 "name": "BaseBdev3", 00:18:13.408 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:13.408 "is_configured": true, 00:18:13.408 "data_offset": 0, 00:18:13.408 "data_size": 65536 00:18:13.408 }, 
00:18:13.408 { 00:18:13.408 "name": "BaseBdev4", 00:18:13.408 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:13.408 "is_configured": true, 00:18:13.408 "data_offset": 0, 00:18:13.408 "data_size": 65536 00:18:13.408 } 00:18:13.408 ] 00:18:13.408 }' 00:18:13.408 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:13.408 11:54:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:13.975 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:13.975 11:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.234 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:18:14.234 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:14.494 [2024-05-14 11:54:41.433457] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:14.494 11:54:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.494 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:14.753 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:14.753 "name": "Existed_Raid", 00:18:14.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:14.753 "strip_size_kb": 0, 00:18:14.753 "state": "configuring", 00:18:14.753 "raid_level": "raid1", 00:18:14.753 "superblock": false, 00:18:14.753 "num_base_bdevs": 4, 00:18:14.753 "num_base_bdevs_discovered": 3, 00:18:14.753 "num_base_bdevs_operational": 4, 00:18:14.753 "base_bdevs_list": [ 00:18:14.753 { 00:18:14.753 "name": null, 00:18:14.753 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:14.753 "is_configured": false, 00:18:14.753 "data_offset": 0, 00:18:14.753 "data_size": 65536 00:18:14.753 }, 00:18:14.753 { 00:18:14.753 "name": "BaseBdev2", 00:18:14.753 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:14.753 "is_configured": true, 00:18:14.753 "data_offset": 0, 00:18:14.753 "data_size": 65536 00:18:14.753 }, 00:18:14.753 { 00:18:14.753 "name": "BaseBdev3", 00:18:14.753 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:14.753 "is_configured": true, 00:18:14.753 "data_offset": 0, 00:18:14.753 "data_size": 65536 00:18:14.753 }, 00:18:14.753 { 00:18:14.753 "name": "BaseBdev4", 00:18:14.753 "uuid": 
"473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:14.753 "is_configured": true, 00:18:14.753 "data_offset": 0, 00:18:14.753 "data_size": 65536 00:18:14.753 } 00:18:14.753 ] 00:18:14.753 }' 00:18:14.753 11:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:14.753 11:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:15.321 11:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.321 11:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:15.579 11:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:18:15.579 11:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.579 11:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:15.838 11:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 64b1e63c-2e2b-47cf-9086-fdf97114bf91 00:18:16.119 [2024-05-14 11:54:42.982217] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:16.119 [2024-05-14 11:54:42.982266] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a69450 00:18:16.119 [2024-05-14 11:54:42.982275] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:16.119 [2024-05-14 11:54:42.982485] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a69e60 00:18:16.119 [2024-05-14 11:54:42.982630] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x1a69450 00:18:16.119 [2024-05-14 11:54:42.982640] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a69450 00:18:16.119 [2024-05-14 11:54:42.982815] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:16.119 NewBaseBdev 00:18:16.119 11:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:18:16.119 11:54:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:18:16.120 11:54:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:16.120 11:54:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:18:16.120 11:54:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:16.120 11:54:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:16.120 11:54:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:16.379 11:54:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:16.638 [ 00:18:16.638 { 00:18:16.638 "name": "NewBaseBdev", 00:18:16.638 "aliases": [ 00:18:16.638 "64b1e63c-2e2b-47cf-9086-fdf97114bf91" 00:18:16.638 ], 00:18:16.638 "product_name": "Malloc disk", 00:18:16.638 "block_size": 512, 00:18:16.638 "num_blocks": 65536, 00:18:16.638 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:16.638 "assigned_rate_limits": { 00:18:16.638 "rw_ios_per_sec": 0, 00:18:16.638 "rw_mbytes_per_sec": 0, 00:18:16.638 "r_mbytes_per_sec": 0, 00:18:16.638 "w_mbytes_per_sec": 0 00:18:16.638 }, 00:18:16.638 "claimed": true, 00:18:16.638 
"claim_type": "exclusive_write", 00:18:16.638 "zoned": false, 00:18:16.638 "supported_io_types": { 00:18:16.638 "read": true, 00:18:16.638 "write": true, 00:18:16.638 "unmap": true, 00:18:16.638 "write_zeroes": true, 00:18:16.638 "flush": true, 00:18:16.638 "reset": true, 00:18:16.638 "compare": false, 00:18:16.638 "compare_and_write": false, 00:18:16.638 "abort": true, 00:18:16.638 "nvme_admin": false, 00:18:16.638 "nvme_io": false 00:18:16.638 }, 00:18:16.638 "memory_domains": [ 00:18:16.638 { 00:18:16.638 "dma_device_id": "system", 00:18:16.638 "dma_device_type": 1 00:18:16.638 }, 00:18:16.638 { 00:18:16.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.638 "dma_device_type": 2 00:18:16.638 } 00:18:16.638 ], 00:18:16.638 "driver_specific": {} 00:18:16.638 } 00:18:16.638 ] 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:16.638 11:54:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.638 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:16.897 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:16.897 "name": "Existed_Raid", 00:18:16.897 "uuid": "6954028a-cdbc-49c1-b242-0dbfb53a02a5", 00:18:16.897 "strip_size_kb": 0, 00:18:16.897 "state": "online", 00:18:16.897 "raid_level": "raid1", 00:18:16.897 "superblock": false, 00:18:16.897 "num_base_bdevs": 4, 00:18:16.897 "num_base_bdevs_discovered": 4, 00:18:16.897 "num_base_bdevs_operational": 4, 00:18:16.897 "base_bdevs_list": [ 00:18:16.897 { 00:18:16.897 "name": "NewBaseBdev", 00:18:16.897 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:16.897 "is_configured": true, 00:18:16.897 "data_offset": 0, 00:18:16.897 "data_size": 65536 00:18:16.897 }, 00:18:16.897 { 00:18:16.897 "name": "BaseBdev2", 00:18:16.897 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:16.897 "is_configured": true, 00:18:16.897 "data_offset": 0, 00:18:16.897 "data_size": 65536 00:18:16.897 }, 00:18:16.897 { 00:18:16.897 "name": "BaseBdev3", 00:18:16.897 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:16.897 "is_configured": true, 00:18:16.897 "data_offset": 0, 00:18:16.897 "data_size": 65536 00:18:16.897 }, 00:18:16.897 { 00:18:16.897 "name": "BaseBdev4", 00:18:16.897 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:16.897 "is_configured": true, 00:18:16.897 "data_offset": 0, 00:18:16.897 "data_size": 65536 00:18:16.897 } 00:18:16.897 ] 00:18:16.897 }' 00:18:16.897 11:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:16.897 11:54:43 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:18:17.465 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:18:17.465 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:18:17.465 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:18:17.465 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:18:17.465 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:18:17.465 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:18:17.465 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:17.465 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:18:17.465 [2024-05-14 11:54:44.510595] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:17.465 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:18:17.465 "name": "Existed_Raid", 00:18:17.465 "aliases": [ 00:18:17.465 "6954028a-cdbc-49c1-b242-0dbfb53a02a5" 00:18:17.465 ], 00:18:17.465 "product_name": "Raid Volume", 00:18:17.465 "block_size": 512, 00:18:17.465 "num_blocks": 65536, 00:18:17.465 "uuid": "6954028a-cdbc-49c1-b242-0dbfb53a02a5", 00:18:17.465 "assigned_rate_limits": { 00:18:17.465 "rw_ios_per_sec": 0, 00:18:17.465 "rw_mbytes_per_sec": 0, 00:18:17.465 "r_mbytes_per_sec": 0, 00:18:17.465 "w_mbytes_per_sec": 0 00:18:17.465 }, 00:18:17.465 "claimed": false, 00:18:17.465 "zoned": false, 00:18:17.465 "supported_io_types": { 00:18:17.465 "read": true, 00:18:17.465 "write": true, 00:18:17.465 "unmap": false, 00:18:17.465 "write_zeroes": true, 00:18:17.465 "flush": false, 00:18:17.465 
"reset": true, 00:18:17.465 "compare": false, 00:18:17.465 "compare_and_write": false, 00:18:17.465 "abort": false, 00:18:17.465 "nvme_admin": false, 00:18:17.465 "nvme_io": false 00:18:17.465 }, 00:18:17.465 "memory_domains": [ 00:18:17.465 { 00:18:17.465 "dma_device_id": "system", 00:18:17.465 "dma_device_type": 1 00:18:17.465 }, 00:18:17.465 { 00:18:17.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.465 "dma_device_type": 2 00:18:17.465 }, 00:18:17.465 { 00:18:17.465 "dma_device_id": "system", 00:18:17.465 "dma_device_type": 1 00:18:17.465 }, 00:18:17.465 { 00:18:17.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.465 "dma_device_type": 2 00:18:17.466 }, 00:18:17.466 { 00:18:17.466 "dma_device_id": "system", 00:18:17.466 "dma_device_type": 1 00:18:17.466 }, 00:18:17.466 { 00:18:17.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.466 "dma_device_type": 2 00:18:17.466 }, 00:18:17.466 { 00:18:17.466 "dma_device_id": "system", 00:18:17.466 "dma_device_type": 1 00:18:17.466 }, 00:18:17.466 { 00:18:17.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.466 "dma_device_type": 2 00:18:17.466 } 00:18:17.466 ], 00:18:17.466 "driver_specific": { 00:18:17.466 "raid": { 00:18:17.466 "uuid": "6954028a-cdbc-49c1-b242-0dbfb53a02a5", 00:18:17.466 "strip_size_kb": 0, 00:18:17.466 "state": "online", 00:18:17.466 "raid_level": "raid1", 00:18:17.466 "superblock": false, 00:18:17.466 "num_base_bdevs": 4, 00:18:17.466 "num_base_bdevs_discovered": 4, 00:18:17.466 "num_base_bdevs_operational": 4, 00:18:17.466 "base_bdevs_list": [ 00:18:17.466 { 00:18:17.466 "name": "NewBaseBdev", 00:18:17.466 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:17.466 "is_configured": true, 00:18:17.466 "data_offset": 0, 00:18:17.466 "data_size": 65536 00:18:17.466 }, 00:18:17.466 { 00:18:17.466 "name": "BaseBdev2", 00:18:17.466 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:17.466 "is_configured": true, 00:18:17.466 "data_offset": 0, 00:18:17.466 "data_size": 65536 
00:18:17.466 }, 00:18:17.466 { 00:18:17.466 "name": "BaseBdev3", 00:18:17.466 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:17.466 "is_configured": true, 00:18:17.466 "data_offset": 0, 00:18:17.466 "data_size": 65536 00:18:17.466 }, 00:18:17.466 { 00:18:17.466 "name": "BaseBdev4", 00:18:17.466 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:17.466 "is_configured": true, 00:18:17.466 "data_offset": 0, 00:18:17.466 "data_size": 65536 00:18:17.466 } 00:18:17.466 ] 00:18:17.466 } 00:18:17.466 } 00:18:17.466 }' 00:18:17.466 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:17.724 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:18:17.724 BaseBdev2 00:18:17.724 BaseBdev3 00:18:17.724 BaseBdev4' 00:18:17.724 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:17.724 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:17.724 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:17.983 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:17.983 "name": "NewBaseBdev", 00:18:17.983 "aliases": [ 00:18:17.983 "64b1e63c-2e2b-47cf-9086-fdf97114bf91" 00:18:17.983 ], 00:18:17.983 "product_name": "Malloc disk", 00:18:17.983 "block_size": 512, 00:18:17.983 "num_blocks": 65536, 00:18:17.983 "uuid": "64b1e63c-2e2b-47cf-9086-fdf97114bf91", 00:18:17.983 "assigned_rate_limits": { 00:18:17.983 "rw_ios_per_sec": 0, 00:18:17.983 "rw_mbytes_per_sec": 0, 00:18:17.983 "r_mbytes_per_sec": 0, 00:18:17.983 "w_mbytes_per_sec": 0 00:18:17.983 }, 00:18:17.983 "claimed": true, 00:18:17.983 "claim_type": "exclusive_write", 00:18:17.983 
"zoned": false, 00:18:17.983 "supported_io_types": { 00:18:17.983 "read": true, 00:18:17.983 "write": true, 00:18:17.983 "unmap": true, 00:18:17.983 "write_zeroes": true, 00:18:17.983 "flush": true, 00:18:17.983 "reset": true, 00:18:17.983 "compare": false, 00:18:17.983 "compare_and_write": false, 00:18:17.983 "abort": true, 00:18:17.983 "nvme_admin": false, 00:18:17.983 "nvme_io": false 00:18:17.983 }, 00:18:17.983 "memory_domains": [ 00:18:17.983 { 00:18:17.983 "dma_device_id": "system", 00:18:17.983 "dma_device_type": 1 00:18:17.983 }, 00:18:17.983 { 00:18:17.983 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.983 "dma_device_type": 2 00:18:17.983 } 00:18:17.983 ], 00:18:17.983 "driver_specific": {} 00:18:17.983 }' 00:18:17.983 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:17.983 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:17.983 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:17.983 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:17.983 11:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:17.983 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.983 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:17.983 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:18.242 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.242 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:18.242 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:18.242 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:18.242 11:54:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:18.242 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:18.242 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:18.501 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:18.501 "name": "BaseBdev2", 00:18:18.501 "aliases": [ 00:18:18.501 "368034c6-d115-4862-82a3-46908ff4a887" 00:18:18.501 ], 00:18:18.501 "product_name": "Malloc disk", 00:18:18.501 "block_size": 512, 00:18:18.501 "num_blocks": 65536, 00:18:18.501 "uuid": "368034c6-d115-4862-82a3-46908ff4a887", 00:18:18.501 "assigned_rate_limits": { 00:18:18.501 "rw_ios_per_sec": 0, 00:18:18.501 "rw_mbytes_per_sec": 0, 00:18:18.501 "r_mbytes_per_sec": 0, 00:18:18.501 "w_mbytes_per_sec": 0 00:18:18.501 }, 00:18:18.501 "claimed": true, 00:18:18.501 "claim_type": "exclusive_write", 00:18:18.501 "zoned": false, 00:18:18.501 "supported_io_types": { 00:18:18.501 "read": true, 00:18:18.501 "write": true, 00:18:18.501 "unmap": true, 00:18:18.501 "write_zeroes": true, 00:18:18.501 "flush": true, 00:18:18.501 "reset": true, 00:18:18.501 "compare": false, 00:18:18.501 "compare_and_write": false, 00:18:18.501 "abort": true, 00:18:18.501 "nvme_admin": false, 00:18:18.501 "nvme_io": false 00:18:18.501 }, 00:18:18.501 "memory_domains": [ 00:18:18.501 { 00:18:18.501 "dma_device_id": "system", 00:18:18.501 "dma_device_type": 1 00:18:18.501 }, 00:18:18.501 { 00:18:18.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.501 "dma_device_type": 2 00:18:18.501 } 00:18:18.501 ], 00:18:18.501 "driver_specific": {} 00:18:18.501 }' 00:18:18.501 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:18.501 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 
-- # jq .block_size 00:18:18.501 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:18.501 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:18.501 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:18.501 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.501 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:18.760 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:18.760 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.760 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:18.760 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:18.760 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:18.760 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:18.760 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:18.760 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:19.019 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:19.019 "name": "BaseBdev3", 00:18:19.019 "aliases": [ 00:18:19.019 "998803f2-118a-42c0-ad90-b1e11b4839e8" 00:18:19.019 ], 00:18:19.019 "product_name": "Malloc disk", 00:18:19.019 "block_size": 512, 00:18:19.019 "num_blocks": 65536, 00:18:19.019 "uuid": "998803f2-118a-42c0-ad90-b1e11b4839e8", 00:18:19.019 "assigned_rate_limits": { 00:18:19.019 "rw_ios_per_sec": 0, 00:18:19.019 "rw_mbytes_per_sec": 0, 00:18:19.019 
"r_mbytes_per_sec": 0, 00:18:19.019 "w_mbytes_per_sec": 0 00:18:19.019 }, 00:18:19.019 "claimed": true, 00:18:19.019 "claim_type": "exclusive_write", 00:18:19.019 "zoned": false, 00:18:19.019 "supported_io_types": { 00:18:19.019 "read": true, 00:18:19.019 "write": true, 00:18:19.019 "unmap": true, 00:18:19.019 "write_zeroes": true, 00:18:19.019 "flush": true, 00:18:19.019 "reset": true, 00:18:19.019 "compare": false, 00:18:19.019 "compare_and_write": false, 00:18:19.019 "abort": true, 00:18:19.019 "nvme_admin": false, 00:18:19.019 "nvme_io": false 00:18:19.019 }, 00:18:19.019 "memory_domains": [ 00:18:19.019 { 00:18:19.019 "dma_device_id": "system", 00:18:19.019 "dma_device_type": 1 00:18:19.019 }, 00:18:19.019 { 00:18:19.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.019 "dma_device_type": 2 00:18:19.019 } 00:18:19.019 ], 00:18:19.019 "driver_specific": {} 00:18:19.019 }' 00:18:19.019 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:19.019 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:19.019 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:19.019 11:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:19.019 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:19.019 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.019 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:19.278 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:19.278 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.278 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:19.278 11:54:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:19.278 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:19.278 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:19.278 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:19.278 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:19.536 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:19.536 "name": "BaseBdev4", 00:18:19.536 "aliases": [ 00:18:19.536 "473bc47d-0c4f-4ca8-80de-2611c471b970" 00:18:19.536 ], 00:18:19.536 "product_name": "Malloc disk", 00:18:19.536 "block_size": 512, 00:18:19.536 "num_blocks": 65536, 00:18:19.536 "uuid": "473bc47d-0c4f-4ca8-80de-2611c471b970", 00:18:19.536 "assigned_rate_limits": { 00:18:19.536 "rw_ios_per_sec": 0, 00:18:19.536 "rw_mbytes_per_sec": 0, 00:18:19.537 "r_mbytes_per_sec": 0, 00:18:19.537 "w_mbytes_per_sec": 0 00:18:19.537 }, 00:18:19.537 "claimed": true, 00:18:19.537 "claim_type": "exclusive_write", 00:18:19.537 "zoned": false, 00:18:19.537 "supported_io_types": { 00:18:19.537 "read": true, 00:18:19.537 "write": true, 00:18:19.537 "unmap": true, 00:18:19.537 "write_zeroes": true, 00:18:19.537 "flush": true, 00:18:19.537 "reset": true, 00:18:19.537 "compare": false, 00:18:19.537 "compare_and_write": false, 00:18:19.537 "abort": true, 00:18:19.537 "nvme_admin": false, 00:18:19.537 "nvme_io": false 00:18:19.537 }, 00:18:19.537 "memory_domains": [ 00:18:19.537 { 00:18:19.537 "dma_device_id": "system", 00:18:19.537 "dma_device_type": 1 00:18:19.537 }, 00:18:19.537 { 00:18:19.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.537 "dma_device_type": 2 00:18:19.537 } 00:18:19.537 ], 00:18:19.537 "driver_specific": {} 00:18:19.537 }' 00:18:19.537 
11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:19.537 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:19.537 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:19.537 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:19.795 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:19.796 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.796 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:19.796 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:19.796 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.796 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:19.796 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:19.796 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:19.796 11:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:20.055 [2024-05-14 11:54:47.045016] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:20.055 [2024-05-14 11:54:47.045046] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:20.055 [2024-05-14 11:54:47.045106] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:20.055 [2024-05-14 11:54:47.045396] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:20.055 [2024-05-14 11:54:47.045416] 
bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a69450 name Existed_Raid, state offline 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 1736880 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 1736880 ']' 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 1736880 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1736880 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1736880' 00:18:20.055 killing process with pid 1736880 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 1736880 00:18:20.055 [2024-05-14 11:54:47.094936] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:20.055 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 1736880 00:18:20.055 [2024-05-14 11:54:47.132048] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:20.314 11:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:18:20.314 00:18:20.314 real 0m31.364s 00:18:20.314 user 0m57.570s 00:18:20.314 sys 0m5.584s 00:18:20.314 11:54:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:18:20.314 11:54:47 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:18:20.314 ************************************ 00:18:20.314 END TEST raid_state_function_test 00:18:20.314 ************************************ 00:18:20.314 11:54:47 bdev_raid -- bdev/bdev_raid.sh@816 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:18:20.314 11:54:47 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:18:20.314 11:54:47 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:18:20.314 11:54:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:20.573 ************************************ 00:18:20.573 START TEST raid_state_function_test_sb 00:18:20.573 ************************************ 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 4 true 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:18:20.573 11:54:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # 
superblock_create_arg=-s 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=1741593 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1741593' 00:18:20.573 Process raid pid: 1741593 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 1741593 /var/tmp/spdk-raid.sock 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1741593 ']' 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:20.573 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:18:20.573 11:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:20.573 [2024-05-14 11:54:47.490266] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:18:20.573 [2024-05-14 11:54:47.490328] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:20.573 [2024-05-14 11:54:47.618264] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:20.832 [2024-05-14 11:54:47.723395] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:20.832 [2024-05-14 11:54:47.785700] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:20.832 [2024-05-14 11:54:47.785737] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:21.398 11:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:18:21.398 11:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:18:21.398 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:21.657 [2024-05-14 11:54:48.634317] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:21.657 [2024-05-14 11:54:48.634358] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:21.657 [2024-05-14 11:54:48.634369] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:21.657 [2024-05-14 11:54:48.634381] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:21.657 [2024-05-14 11:54:48.634390] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:21.657 [2024-05-14 11:54:48.634407] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:21.657 [2024-05-14 
11:54:48.634416] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:21.657 [2024-05-14 11:54:48.634432] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.657 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:21.916 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:21.916 "name": "Existed_Raid", 00:18:21.916 "uuid": "90ad04d7-d2c3-45cb-b830-4af216e2726f", 00:18:21.916 "strip_size_kb": 
0, 00:18:21.916 "state": "configuring", 00:18:21.916 "raid_level": "raid1", 00:18:21.916 "superblock": true, 00:18:21.916 "num_base_bdevs": 4, 00:18:21.916 "num_base_bdevs_discovered": 0, 00:18:21.916 "num_base_bdevs_operational": 4, 00:18:21.916 "base_bdevs_list": [ 00:18:21.916 { 00:18:21.916 "name": "BaseBdev1", 00:18:21.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.916 "is_configured": false, 00:18:21.916 "data_offset": 0, 00:18:21.916 "data_size": 0 00:18:21.916 }, 00:18:21.916 { 00:18:21.916 "name": "BaseBdev2", 00:18:21.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.916 "is_configured": false, 00:18:21.916 "data_offset": 0, 00:18:21.916 "data_size": 0 00:18:21.916 }, 00:18:21.916 { 00:18:21.916 "name": "BaseBdev3", 00:18:21.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.916 "is_configured": false, 00:18:21.916 "data_offset": 0, 00:18:21.916 "data_size": 0 00:18:21.916 }, 00:18:21.916 { 00:18:21.916 "name": "BaseBdev4", 00:18:21.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.916 "is_configured": false, 00:18:21.916 "data_offset": 0, 00:18:21.916 "data_size": 0 00:18:21.916 } 00:18:21.916 ] 00:18:21.916 }' 00:18:21.916 11:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:21.916 11:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:22.484 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:22.745 [2024-05-14 11:54:49.640817] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:22.745 [2024-05-14 11:54:49.640848] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x823720 name Existed_Raid, state configuring 00:18:22.745 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:23.073 [2024-05-14 11:54:49.869460] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:23.073 [2024-05-14 11:54:49.869490] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:23.073 [2024-05-14 11:54:49.869500] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:23.073 [2024-05-14 11:54:49.869512] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:23.073 [2024-05-14 11:54:49.869521] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:23.073 [2024-05-14 11:54:49.869536] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:23.073 [2024-05-14 11:54:49.869545] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:23.073 [2024-05-14 11:54:49.869556] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:23.073 11:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:23.073 [2024-05-14 11:54:50.124679] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:23.073 BaseBdev1 00:18:23.073 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:18:23.073 11:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:18:23.073 11:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:23.073 11:54:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:23.073 11:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:23.073 11:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:23.073 11:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:23.332 11:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:23.591 [ 00:18:23.591 { 00:18:23.591 "name": "BaseBdev1", 00:18:23.591 "aliases": [ 00:18:23.592 "a85cc84e-8743-4468-b710-3b247201c60f" 00:18:23.592 ], 00:18:23.592 "product_name": "Malloc disk", 00:18:23.592 "block_size": 512, 00:18:23.592 "num_blocks": 65536, 00:18:23.592 "uuid": "a85cc84e-8743-4468-b710-3b247201c60f", 00:18:23.592 "assigned_rate_limits": { 00:18:23.592 "rw_ios_per_sec": 0, 00:18:23.592 "rw_mbytes_per_sec": 0, 00:18:23.592 "r_mbytes_per_sec": 0, 00:18:23.592 "w_mbytes_per_sec": 0 00:18:23.592 }, 00:18:23.592 "claimed": true, 00:18:23.592 "claim_type": "exclusive_write", 00:18:23.592 "zoned": false, 00:18:23.592 "supported_io_types": { 00:18:23.592 "read": true, 00:18:23.592 "write": true, 00:18:23.592 "unmap": true, 00:18:23.592 "write_zeroes": true, 00:18:23.592 "flush": true, 00:18:23.592 "reset": true, 00:18:23.592 "compare": false, 00:18:23.592 "compare_and_write": false, 00:18:23.592 "abort": true, 00:18:23.592 "nvme_admin": false, 00:18:23.592 "nvme_io": false 00:18:23.592 }, 00:18:23.592 "memory_domains": [ 00:18:23.592 { 00:18:23.592 "dma_device_id": "system", 00:18:23.592 "dma_device_type": 1 00:18:23.592 }, 00:18:23.592 { 00:18:23.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.592 
"dma_device_type": 2 00:18:23.592 } 00:18:23.592 ], 00:18:23.592 "driver_specific": {} 00:18:23.592 } 00:18:23.592 ] 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.592 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:23.852 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:23.852 "name": "Existed_Raid", 00:18:23.852 "uuid": "335c7e41-ba63-42e9-87db-eaf97ad376a3", 00:18:23.852 "strip_size_kb": 0, 
00:18:23.852 "state": "configuring", 00:18:23.852 "raid_level": "raid1", 00:18:23.852 "superblock": true, 00:18:23.852 "num_base_bdevs": 4, 00:18:23.852 "num_base_bdevs_discovered": 1, 00:18:23.852 "num_base_bdevs_operational": 4, 00:18:23.852 "base_bdevs_list": [ 00:18:23.852 { 00:18:23.852 "name": "BaseBdev1", 00:18:23.852 "uuid": "a85cc84e-8743-4468-b710-3b247201c60f", 00:18:23.852 "is_configured": true, 00:18:23.852 "data_offset": 2048, 00:18:23.852 "data_size": 63488 00:18:23.852 }, 00:18:23.852 { 00:18:23.852 "name": "BaseBdev2", 00:18:23.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:23.852 "is_configured": false, 00:18:23.852 "data_offset": 0, 00:18:23.852 "data_size": 0 00:18:23.852 }, 00:18:23.852 { 00:18:23.852 "name": "BaseBdev3", 00:18:23.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:23.852 "is_configured": false, 00:18:23.852 "data_offset": 0, 00:18:23.852 "data_size": 0 00:18:23.852 }, 00:18:23.852 { 00:18:23.852 "name": "BaseBdev4", 00:18:23.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:23.852 "is_configured": false, 00:18:23.852 "data_offset": 0, 00:18:23.852 "data_size": 0 00:18:23.852 } 00:18:23.852 ] 00:18:23.852 }' 00:18:23.852 11:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:23.852 11:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:24.419 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:24.678 [2024-05-14 11:54:51.644775] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:24.678 [2024-05-14 11:54:51.644821] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x822fb0 name Existed_Raid, state configuring 00:18:24.678 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:24.938 [2024-05-14 11:54:51.885460] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:24.938 [2024-05-14 11:54:51.886998] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:24.938 [2024-05-14 11:54:51.887031] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:24.938 [2024-05-14 11:54:51.887042] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:24.938 [2024-05-14 11:54:51.887054] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:24.938 [2024-05-14 11:54:51.887063] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:24.938 [2024-05-14 11:54:51.887075] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.938 11:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.198 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:25.198 "name": "Existed_Raid", 00:18:25.198 "uuid": "3dbcf786-62f7-4d59-a9da-9e964bb293f9", 00:18:25.198 "strip_size_kb": 0, 00:18:25.198 "state": "configuring", 00:18:25.198 "raid_level": "raid1", 00:18:25.198 "superblock": true, 00:18:25.198 "num_base_bdevs": 4, 00:18:25.198 "num_base_bdevs_discovered": 1, 00:18:25.198 "num_base_bdevs_operational": 4, 00:18:25.198 "base_bdevs_list": [ 00:18:25.198 { 00:18:25.198 "name": "BaseBdev1", 00:18:25.198 "uuid": "a85cc84e-8743-4468-b710-3b247201c60f", 00:18:25.198 "is_configured": true, 00:18:25.198 "data_offset": 2048, 00:18:25.198 "data_size": 63488 00:18:25.198 }, 00:18:25.198 { 00:18:25.198 "name": "BaseBdev2", 00:18:25.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.198 "is_configured": false, 00:18:25.198 "data_offset": 0, 00:18:25.198 "data_size": 0 00:18:25.198 }, 00:18:25.198 { 00:18:25.198 "name": "BaseBdev3", 00:18:25.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.198 "is_configured": false, 00:18:25.198 "data_offset": 0, 00:18:25.198 
"data_size": 0 00:18:25.198 }, 00:18:25.198 { 00:18:25.198 "name": "BaseBdev4", 00:18:25.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.198 "is_configured": false, 00:18:25.198 "data_offset": 0, 00:18:25.198 "data_size": 0 00:18:25.198 } 00:18:25.198 ] 00:18:25.198 }' 00:18:25.198 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:25.198 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:25.765 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:26.024 [2024-05-14 11:54:52.955612] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:26.024 BaseBdev2 00:18:26.024 11:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:18:26.024 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:18:26.024 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:26.024 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:26.024 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:26.024 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:26.024 11:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:26.283 11:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:26.542 [ 
00:18:26.542 { 00:18:26.542 "name": "BaseBdev2", 00:18:26.542 "aliases": [ 00:18:26.542 "8983f020-5010-4d97-8ead-f10094e9a6e3" 00:18:26.542 ], 00:18:26.542 "product_name": "Malloc disk", 00:18:26.542 "block_size": 512, 00:18:26.542 "num_blocks": 65536, 00:18:26.542 "uuid": "8983f020-5010-4d97-8ead-f10094e9a6e3", 00:18:26.542 "assigned_rate_limits": { 00:18:26.542 "rw_ios_per_sec": 0, 00:18:26.542 "rw_mbytes_per_sec": 0, 00:18:26.542 "r_mbytes_per_sec": 0, 00:18:26.542 "w_mbytes_per_sec": 0 00:18:26.542 }, 00:18:26.542 "claimed": true, 00:18:26.542 "claim_type": "exclusive_write", 00:18:26.542 "zoned": false, 00:18:26.542 "supported_io_types": { 00:18:26.542 "read": true, 00:18:26.542 "write": true, 00:18:26.542 "unmap": true, 00:18:26.542 "write_zeroes": true, 00:18:26.542 "flush": true, 00:18:26.542 "reset": true, 00:18:26.542 "compare": false, 00:18:26.542 "compare_and_write": false, 00:18:26.542 "abort": true, 00:18:26.542 "nvme_admin": false, 00:18:26.542 "nvme_io": false 00:18:26.542 }, 00:18:26.542 "memory_domains": [ 00:18:26.542 { 00:18:26.542 "dma_device_id": "system", 00:18:26.542 "dma_device_type": 1 00:18:26.542 }, 00:18:26.542 { 00:18:26.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:26.542 "dma_device_type": 2 00:18:26.542 } 00:18:26.542 ], 00:18:26.542 "driver_specific": {} 00:18:26.542 } 00:18:26.542 ] 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.542 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.801 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:26.801 "name": "Existed_Raid", 00:18:26.801 "uuid": "3dbcf786-62f7-4d59-a9da-9e964bb293f9", 00:18:26.801 "strip_size_kb": 0, 00:18:26.801 "state": "configuring", 00:18:26.801 "raid_level": "raid1", 00:18:26.801 "superblock": true, 00:18:26.801 "num_base_bdevs": 4, 00:18:26.801 "num_base_bdevs_discovered": 2, 00:18:26.801 "num_base_bdevs_operational": 4, 00:18:26.801 "base_bdevs_list": [ 00:18:26.801 { 00:18:26.801 "name": "BaseBdev1", 00:18:26.801 "uuid": "a85cc84e-8743-4468-b710-3b247201c60f", 00:18:26.801 "is_configured": true, 00:18:26.801 "data_offset": 2048, 00:18:26.801 "data_size": 63488 00:18:26.801 }, 00:18:26.801 { 00:18:26.801 "name": "BaseBdev2", 00:18:26.801 "uuid": 
"8983f020-5010-4d97-8ead-f10094e9a6e3", 00:18:26.801 "is_configured": true, 00:18:26.801 "data_offset": 2048, 00:18:26.801 "data_size": 63488 00:18:26.801 }, 00:18:26.801 { 00:18:26.801 "name": "BaseBdev3", 00:18:26.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.801 "is_configured": false, 00:18:26.801 "data_offset": 0, 00:18:26.801 "data_size": 0 00:18:26.801 }, 00:18:26.801 { 00:18:26.801 "name": "BaseBdev4", 00:18:26.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.801 "is_configured": false, 00:18:26.801 "data_offset": 0, 00:18:26.801 "data_size": 0 00:18:26.801 } 00:18:26.801 ] 00:18:26.801 }' 00:18:26.801 11:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:26.801 11:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.367 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:27.626 [2024-05-14 11:54:54.543161] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:27.626 BaseBdev3 00:18:27.626 11:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:18:27.626 11:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:18:27.626 11:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:27.626 11:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:27.626 11:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:27.626 11:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:27.626 11:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:27.883 11:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:28.142 [ 00:18:28.142 { 00:18:28.142 "name": "BaseBdev3", 00:18:28.142 "aliases": [ 00:18:28.142 "42f61830-092a-45b4-a6e9-b5d2666638ed" 00:18:28.142 ], 00:18:28.142 "product_name": "Malloc disk", 00:18:28.142 "block_size": 512, 00:18:28.142 "num_blocks": 65536, 00:18:28.142 "uuid": "42f61830-092a-45b4-a6e9-b5d2666638ed", 00:18:28.142 "assigned_rate_limits": { 00:18:28.142 "rw_ios_per_sec": 0, 00:18:28.142 "rw_mbytes_per_sec": 0, 00:18:28.142 "r_mbytes_per_sec": 0, 00:18:28.142 "w_mbytes_per_sec": 0 00:18:28.142 }, 00:18:28.142 "claimed": true, 00:18:28.142 "claim_type": "exclusive_write", 00:18:28.142 "zoned": false, 00:18:28.142 "supported_io_types": { 00:18:28.142 "read": true, 00:18:28.142 "write": true, 00:18:28.142 "unmap": true, 00:18:28.142 "write_zeroes": true, 00:18:28.142 "flush": true, 00:18:28.142 "reset": true, 00:18:28.142 "compare": false, 00:18:28.142 "compare_and_write": false, 00:18:28.142 "abort": true, 00:18:28.142 "nvme_admin": false, 00:18:28.142 "nvme_io": false 00:18:28.142 }, 00:18:28.142 "memory_domains": [ 00:18:28.142 { 00:18:28.142 "dma_device_id": "system", 00:18:28.142 "dma_device_type": 1 00:18:28.142 }, 00:18:28.142 { 00:18:28.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.142 "dma_device_type": 2 00:18:28.142 } 00:18:28.142 ], 00:18:28.142 "driver_specific": {} 00:18:28.142 } 00:18:28.142 ] 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < 
num_base_bdevs )) 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.142 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:28.401 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:28.401 "name": "Existed_Raid", 00:18:28.401 "uuid": "3dbcf786-62f7-4d59-a9da-9e964bb293f9", 00:18:28.401 "strip_size_kb": 0, 00:18:28.401 "state": "configuring", 00:18:28.401 "raid_level": "raid1", 00:18:28.401 "superblock": true, 00:18:28.401 "num_base_bdevs": 4, 00:18:28.401 "num_base_bdevs_discovered": 3, 00:18:28.401 
"num_base_bdevs_operational": 4, 00:18:28.401 "base_bdevs_list": [ 00:18:28.401 { 00:18:28.401 "name": "BaseBdev1", 00:18:28.401 "uuid": "a85cc84e-8743-4468-b710-3b247201c60f", 00:18:28.401 "is_configured": true, 00:18:28.401 "data_offset": 2048, 00:18:28.401 "data_size": 63488 00:18:28.401 }, 00:18:28.401 { 00:18:28.401 "name": "BaseBdev2", 00:18:28.401 "uuid": "8983f020-5010-4d97-8ead-f10094e9a6e3", 00:18:28.401 "is_configured": true, 00:18:28.401 "data_offset": 2048, 00:18:28.401 "data_size": 63488 00:18:28.401 }, 00:18:28.401 { 00:18:28.401 "name": "BaseBdev3", 00:18:28.401 "uuid": "42f61830-092a-45b4-a6e9-b5d2666638ed", 00:18:28.401 "is_configured": true, 00:18:28.401 "data_offset": 2048, 00:18:28.401 "data_size": 63488 00:18:28.401 }, 00:18:28.401 { 00:18:28.401 "name": "BaseBdev4", 00:18:28.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:28.401 "is_configured": false, 00:18:28.401 "data_offset": 0, 00:18:28.401 "data_size": 0 00:18:28.401 } 00:18:28.401 ] 00:18:28.401 }' 00:18:28.401 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:28.401 11:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:28.969 11:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:29.228 [2024-05-14 11:54:56.098609] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:29.228 [2024-05-14 11:54:56.098779] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x8241b0 00:18:29.228 [2024-05-14 11:54:56.098792] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:29.228 [2024-05-14 11:54:56.098974] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x825860 00:18:29.228 [2024-05-14 11:54:56.099106] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8241b0 00:18:29.228 [2024-05-14 11:54:56.099117] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8241b0 00:18:29.228 [2024-05-14 11:54:56.099215] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:29.228 BaseBdev4 00:18:29.228 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:18:29.228 11:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:18:29.228 11:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:29.228 11:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:29.228 11:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:29.228 11:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:29.228 11:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:29.486 11:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:29.745 [ 00:18:29.745 { 00:18:29.745 "name": "BaseBdev4", 00:18:29.745 "aliases": [ 00:18:29.745 "3fc9c3cb-1a5f-4946-8aed-d5c744dd9ea1" 00:18:29.745 ], 00:18:29.745 "product_name": "Malloc disk", 00:18:29.745 "block_size": 512, 00:18:29.745 "num_blocks": 65536, 00:18:29.745 "uuid": "3fc9c3cb-1a5f-4946-8aed-d5c744dd9ea1", 00:18:29.745 "assigned_rate_limits": { 00:18:29.745 "rw_ios_per_sec": 0, 00:18:29.745 "rw_mbytes_per_sec": 0, 00:18:29.745 "r_mbytes_per_sec": 0, 00:18:29.745 "w_mbytes_per_sec": 
0 00:18:29.745 }, 00:18:29.745 "claimed": true, 00:18:29.745 "claim_type": "exclusive_write", 00:18:29.745 "zoned": false, 00:18:29.745 "supported_io_types": { 00:18:29.745 "read": true, 00:18:29.745 "write": true, 00:18:29.745 "unmap": true, 00:18:29.745 "write_zeroes": true, 00:18:29.745 "flush": true, 00:18:29.745 "reset": true, 00:18:29.745 "compare": false, 00:18:29.745 "compare_and_write": false, 00:18:29.745 "abort": true, 00:18:29.745 "nvme_admin": false, 00:18:29.745 "nvme_io": false 00:18:29.745 }, 00:18:29.745 "memory_domains": [ 00:18:29.745 { 00:18:29.745 "dma_device_id": "system", 00:18:29.745 "dma_device_type": 1 00:18:29.745 }, 00:18:29.745 { 00:18:29.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.745 "dma_device_type": 2 00:18:29.745 } 00:18:29.745 ], 00:18:29.745 "driver_specific": {} 00:18:29.745 } 00:18:29.745 ] 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # 
local raid_bdev_info 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.745 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.003 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:30.003 "name": "Existed_Raid", 00:18:30.003 "uuid": "3dbcf786-62f7-4d59-a9da-9e964bb293f9", 00:18:30.003 "strip_size_kb": 0, 00:18:30.003 "state": "online", 00:18:30.003 "raid_level": "raid1", 00:18:30.003 "superblock": true, 00:18:30.003 "num_base_bdevs": 4, 00:18:30.003 "num_base_bdevs_discovered": 4, 00:18:30.003 "num_base_bdevs_operational": 4, 00:18:30.003 "base_bdevs_list": [ 00:18:30.003 { 00:18:30.003 "name": "BaseBdev1", 00:18:30.003 "uuid": "a85cc84e-8743-4468-b710-3b247201c60f", 00:18:30.003 "is_configured": true, 00:18:30.003 "data_offset": 2048, 00:18:30.003 "data_size": 63488 00:18:30.003 }, 00:18:30.003 { 00:18:30.003 "name": "BaseBdev2", 00:18:30.003 "uuid": "8983f020-5010-4d97-8ead-f10094e9a6e3", 00:18:30.003 "is_configured": true, 00:18:30.003 "data_offset": 2048, 00:18:30.003 "data_size": 63488 00:18:30.003 }, 00:18:30.003 { 00:18:30.003 "name": "BaseBdev3", 00:18:30.003 "uuid": "42f61830-092a-45b4-a6e9-b5d2666638ed", 00:18:30.003 "is_configured": true, 00:18:30.003 "data_offset": 2048, 00:18:30.003 "data_size": 63488 00:18:30.003 }, 00:18:30.003 { 00:18:30.003 "name": "BaseBdev4", 00:18:30.003 "uuid": "3fc9c3cb-1a5f-4946-8aed-d5c744dd9ea1", 00:18:30.003 
"is_configured": true, 00:18:30.003 "data_offset": 2048, 00:18:30.003 "data_size": 63488 00:18:30.003 } 00:18:30.003 ] 00:18:30.003 }' 00:18:30.003 11:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:30.003 11:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:18:30.571 [2024-05-14 11:54:57.626966] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:18:30.571 "name": "Existed_Raid", 00:18:30.571 "aliases": [ 00:18:30.571 "3dbcf786-62f7-4d59-a9da-9e964bb293f9" 00:18:30.571 ], 00:18:30.571 "product_name": "Raid Volume", 00:18:30.571 "block_size": 512, 00:18:30.571 "num_blocks": 63488, 00:18:30.571 "uuid": "3dbcf786-62f7-4d59-a9da-9e964bb293f9", 00:18:30.571 "assigned_rate_limits": { 00:18:30.571 "rw_ios_per_sec": 0, 00:18:30.571 "rw_mbytes_per_sec": 0, 00:18:30.571 
"r_mbytes_per_sec": 0, 00:18:30.571 "w_mbytes_per_sec": 0 00:18:30.571 }, 00:18:30.571 "claimed": false, 00:18:30.571 "zoned": false, 00:18:30.571 "supported_io_types": { 00:18:30.571 "read": true, 00:18:30.571 "write": true, 00:18:30.571 "unmap": false, 00:18:30.571 "write_zeroes": true, 00:18:30.571 "flush": false, 00:18:30.571 "reset": true, 00:18:30.571 "compare": false, 00:18:30.571 "compare_and_write": false, 00:18:30.571 "abort": false, 00:18:30.571 "nvme_admin": false, 00:18:30.571 "nvme_io": false 00:18:30.571 }, 00:18:30.571 "memory_domains": [ 00:18:30.571 { 00:18:30.571 "dma_device_id": "system", 00:18:30.571 "dma_device_type": 1 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.571 "dma_device_type": 2 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "dma_device_id": "system", 00:18:30.571 "dma_device_type": 1 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.571 "dma_device_type": 2 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "dma_device_id": "system", 00:18:30.571 "dma_device_type": 1 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.571 "dma_device_type": 2 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "dma_device_id": "system", 00:18:30.571 "dma_device_type": 1 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.571 "dma_device_type": 2 00:18:30.571 } 00:18:30.571 ], 00:18:30.571 "driver_specific": { 00:18:30.571 "raid": { 00:18:30.571 "uuid": "3dbcf786-62f7-4d59-a9da-9e964bb293f9", 00:18:30.571 "strip_size_kb": 0, 00:18:30.571 "state": "online", 00:18:30.571 "raid_level": "raid1", 00:18:30.571 "superblock": true, 00:18:30.571 "num_base_bdevs": 4, 00:18:30.571 "num_base_bdevs_discovered": 4, 00:18:30.571 "num_base_bdevs_operational": 4, 00:18:30.571 "base_bdevs_list": [ 00:18:30.571 { 00:18:30.571 "name": "BaseBdev1", 00:18:30.571 "uuid": "a85cc84e-8743-4468-b710-3b247201c60f", 
00:18:30.571 "is_configured": true, 00:18:30.571 "data_offset": 2048, 00:18:30.571 "data_size": 63488 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "name": "BaseBdev2", 00:18:30.571 "uuid": "8983f020-5010-4d97-8ead-f10094e9a6e3", 00:18:30.571 "is_configured": true, 00:18:30.571 "data_offset": 2048, 00:18:30.571 "data_size": 63488 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "name": "BaseBdev3", 00:18:30.571 "uuid": "42f61830-092a-45b4-a6e9-b5d2666638ed", 00:18:30.571 "is_configured": true, 00:18:30.571 "data_offset": 2048, 00:18:30.571 "data_size": 63488 00:18:30.571 }, 00:18:30.571 { 00:18:30.571 "name": "BaseBdev4", 00:18:30.571 "uuid": "3fc9c3cb-1a5f-4946-8aed-d5c744dd9ea1", 00:18:30.571 "is_configured": true, 00:18:30.571 "data_offset": 2048, 00:18:30.571 "data_size": 63488 00:18:30.571 } 00:18:30.571 ] 00:18:30.571 } 00:18:30.571 } 00:18:30.571 }' 00:18:30.571 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:30.829 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:18:30.830 BaseBdev2 00:18:30.830 BaseBdev3 00:18:30.830 BaseBdev4' 00:18:30.830 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:30.830 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:30.830 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:31.088 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:31.088 "name": "BaseBdev1", 00:18:31.088 "aliases": [ 00:18:31.088 "a85cc84e-8743-4468-b710-3b247201c60f" 00:18:31.088 ], 00:18:31.088 "product_name": "Malloc disk", 00:18:31.088 "block_size": 512, 00:18:31.088 "num_blocks": 65536, 
00:18:31.088 "uuid": "a85cc84e-8743-4468-b710-3b247201c60f", 00:18:31.088 "assigned_rate_limits": { 00:18:31.088 "rw_ios_per_sec": 0, 00:18:31.088 "rw_mbytes_per_sec": 0, 00:18:31.088 "r_mbytes_per_sec": 0, 00:18:31.088 "w_mbytes_per_sec": 0 00:18:31.088 }, 00:18:31.088 "claimed": true, 00:18:31.088 "claim_type": "exclusive_write", 00:18:31.088 "zoned": false, 00:18:31.088 "supported_io_types": { 00:18:31.088 "read": true, 00:18:31.088 "write": true, 00:18:31.088 "unmap": true, 00:18:31.088 "write_zeroes": true, 00:18:31.088 "flush": true, 00:18:31.088 "reset": true, 00:18:31.088 "compare": false, 00:18:31.088 "compare_and_write": false, 00:18:31.088 "abort": true, 00:18:31.088 "nvme_admin": false, 00:18:31.088 "nvme_io": false 00:18:31.088 }, 00:18:31.088 "memory_domains": [ 00:18:31.088 { 00:18:31.088 "dma_device_id": "system", 00:18:31.088 "dma_device_type": 1 00:18:31.088 }, 00:18:31.088 { 00:18:31.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.088 "dma_device_type": 2 00:18:31.088 } 00:18:31.088 ], 00:18:31.088 "driver_specific": {} 00:18:31.088 }' 00:18:31.088 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:31.088 11:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:31.088 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:31.088 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:31.088 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:31.088 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:31.088 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:31.088 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:31.088 11:54:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:31.088 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:31.347 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:31.347 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:31.347 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:31.347 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:31.347 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:31.605 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:31.605 "name": "BaseBdev2", 00:18:31.605 "aliases": [ 00:18:31.605 "8983f020-5010-4d97-8ead-f10094e9a6e3" 00:18:31.605 ], 00:18:31.605 "product_name": "Malloc disk", 00:18:31.605 "block_size": 512, 00:18:31.605 "num_blocks": 65536, 00:18:31.605 "uuid": "8983f020-5010-4d97-8ead-f10094e9a6e3", 00:18:31.605 "assigned_rate_limits": { 00:18:31.605 "rw_ios_per_sec": 0, 00:18:31.605 "rw_mbytes_per_sec": 0, 00:18:31.605 "r_mbytes_per_sec": 0, 00:18:31.605 "w_mbytes_per_sec": 0 00:18:31.605 }, 00:18:31.605 "claimed": true, 00:18:31.605 "claim_type": "exclusive_write", 00:18:31.605 "zoned": false, 00:18:31.605 "supported_io_types": { 00:18:31.605 "read": true, 00:18:31.605 "write": true, 00:18:31.605 "unmap": true, 00:18:31.605 "write_zeroes": true, 00:18:31.605 "flush": true, 00:18:31.605 "reset": true, 00:18:31.605 "compare": false, 00:18:31.605 "compare_and_write": false, 00:18:31.605 "abort": true, 00:18:31.605 "nvme_admin": false, 00:18:31.605 "nvme_io": false 00:18:31.605 }, 00:18:31.605 "memory_domains": [ 00:18:31.605 { 00:18:31.605 "dma_device_id": "system", 00:18:31.605 
"dma_device_type": 1 00:18:31.605 }, 00:18:31.605 { 00:18:31.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.605 "dma_device_type": 2 00:18:31.605 } 00:18:31.605 ], 00:18:31.605 "driver_specific": {} 00:18:31.605 }' 00:18:31.605 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:31.605 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:31.605 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:31.605 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:31.605 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:31.605 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:31.605 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:31.605 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:31.863 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:31.863 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:31.863 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:31.863 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:31.863 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:31.863 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:31.863 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:32.122 11:54:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:32.122 "name": "BaseBdev3", 00:18:32.122 "aliases": [ 00:18:32.122 "42f61830-092a-45b4-a6e9-b5d2666638ed" 00:18:32.122 ], 00:18:32.122 "product_name": "Malloc disk", 00:18:32.122 "block_size": 512, 00:18:32.122 "num_blocks": 65536, 00:18:32.122 "uuid": "42f61830-092a-45b4-a6e9-b5d2666638ed", 00:18:32.122 "assigned_rate_limits": { 00:18:32.122 "rw_ios_per_sec": 0, 00:18:32.122 "rw_mbytes_per_sec": 0, 00:18:32.122 "r_mbytes_per_sec": 0, 00:18:32.122 "w_mbytes_per_sec": 0 00:18:32.122 }, 00:18:32.122 "claimed": true, 00:18:32.122 "claim_type": "exclusive_write", 00:18:32.122 "zoned": false, 00:18:32.122 "supported_io_types": { 00:18:32.122 "read": true, 00:18:32.122 "write": true, 00:18:32.122 "unmap": true, 00:18:32.122 "write_zeroes": true, 00:18:32.122 "flush": true, 00:18:32.122 "reset": true, 00:18:32.122 "compare": false, 00:18:32.122 "compare_and_write": false, 00:18:32.122 "abort": true, 00:18:32.122 "nvme_admin": false, 00:18:32.122 "nvme_io": false 00:18:32.122 }, 00:18:32.122 "memory_domains": [ 00:18:32.122 { 00:18:32.122 "dma_device_id": "system", 00:18:32.122 "dma_device_type": 1 00:18:32.122 }, 00:18:32.122 { 00:18:32.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.122 "dma_device_type": 2 00:18:32.122 } 00:18:32.122 ], 00:18:32.122 "driver_specific": {} 00:18:32.122 }' 00:18:32.122 11:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:32.122 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:32.122 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:32.122 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:32.122 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:32.122 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:18:32.122 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:32.122 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:32.381 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.381 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:32.381 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:32.381 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:32.381 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:32.381 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:32.381 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:32.640 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:32.640 "name": "BaseBdev4", 00:18:32.640 "aliases": [ 00:18:32.640 "3fc9c3cb-1a5f-4946-8aed-d5c744dd9ea1" 00:18:32.640 ], 00:18:32.640 "product_name": "Malloc disk", 00:18:32.640 "block_size": 512, 00:18:32.640 "num_blocks": 65536, 00:18:32.640 "uuid": "3fc9c3cb-1a5f-4946-8aed-d5c744dd9ea1", 00:18:32.640 "assigned_rate_limits": { 00:18:32.640 "rw_ios_per_sec": 0, 00:18:32.640 "rw_mbytes_per_sec": 0, 00:18:32.640 "r_mbytes_per_sec": 0, 00:18:32.640 "w_mbytes_per_sec": 0 00:18:32.640 }, 00:18:32.640 "claimed": true, 00:18:32.640 "claim_type": "exclusive_write", 00:18:32.640 "zoned": false, 00:18:32.640 "supported_io_types": { 00:18:32.640 "read": true, 00:18:32.640 "write": true, 00:18:32.640 "unmap": true, 00:18:32.640 "write_zeroes": true, 00:18:32.640 "flush": true, 00:18:32.640 "reset": true, 00:18:32.640 
"compare": false, 00:18:32.640 "compare_and_write": false, 00:18:32.640 "abort": true, 00:18:32.640 "nvme_admin": false, 00:18:32.640 "nvme_io": false 00:18:32.640 }, 00:18:32.640 "memory_domains": [ 00:18:32.640 { 00:18:32.640 "dma_device_id": "system", 00:18:32.640 "dma_device_type": 1 00:18:32.640 }, 00:18:32.640 { 00:18:32.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.640 "dma_device_type": 2 00:18:32.640 } 00:18:32.640 ], 00:18:32.640 "driver_specific": {} 00:18:32.640 }' 00:18:32.640 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:32.640 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:32.640 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:32.640 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:32.640 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:32.640 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:32.640 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:32.899 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:32.899 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.899 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:32.899 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:32.899 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:32.899 11:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:33.158 [2024-05-14 
11:55:00.045314] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 0 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:33.158 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:33.159 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:33.159 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:33.159 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:18:33.159 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.417 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:33.417 "name": "Existed_Raid", 00:18:33.417 "uuid": "3dbcf786-62f7-4d59-a9da-9e964bb293f9", 00:18:33.417 "strip_size_kb": 0, 00:18:33.417 "state": "online", 00:18:33.417 "raid_level": "raid1", 00:18:33.417 "superblock": true, 00:18:33.417 "num_base_bdevs": 4, 00:18:33.417 "num_base_bdevs_discovered": 3, 00:18:33.417 "num_base_bdevs_operational": 3, 00:18:33.417 "base_bdevs_list": [ 00:18:33.417 { 00:18:33.417 "name": null, 00:18:33.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.417 "is_configured": false, 00:18:33.417 "data_offset": 2048, 00:18:33.417 "data_size": 63488 00:18:33.417 }, 00:18:33.417 { 00:18:33.417 "name": "BaseBdev2", 00:18:33.417 "uuid": "8983f020-5010-4d97-8ead-f10094e9a6e3", 00:18:33.417 "is_configured": true, 00:18:33.417 "data_offset": 2048, 00:18:33.417 "data_size": 63488 00:18:33.417 }, 00:18:33.417 { 00:18:33.417 "name": "BaseBdev3", 00:18:33.417 "uuid": "42f61830-092a-45b4-a6e9-b5d2666638ed", 00:18:33.417 "is_configured": true, 00:18:33.417 "data_offset": 2048, 00:18:33.417 "data_size": 63488 00:18:33.417 }, 00:18:33.417 { 00:18:33.417 "name": "BaseBdev4", 00:18:33.417 "uuid": "3fc9c3cb-1a5f-4946-8aed-d5c744dd9ea1", 00:18:33.417 "is_configured": true, 00:18:33.417 "data_offset": 2048, 00:18:33.417 "data_size": 63488 00:18:33.417 } 00:18:33.417 ] 00:18:33.417 }' 00:18:33.417 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:33.417 11:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:33.985 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:18:33.985 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 
00:18:33.985 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:18:33.985 11:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.985 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:18:33.985 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:33.985 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:34.245 [2024-05-14 11:55:01.285690] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:34.245 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:18:34.245 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:34.245 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.245 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:18:34.504 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:18:34.504 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:34.504 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:34.763 [2024-05-14 11:55:01.779451] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:34.763 11:55:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:18:34.763 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:34.763 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.763 11:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:18:35.022 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:18:35.022 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:35.022 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:35.281 [2024-05-14 11:55:02.277210] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:35.281 [2024-05-14 11:55:02.277284] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:35.281 [2024-05-14 11:55:02.288117] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:35.281 [2024-05-14 11:55:02.288179] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:35.281 [2024-05-14 11:55:02.288191] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8241b0 name Existed_Raid, state offline 00:18:35.281 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:18:35.281 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:18:35.281 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.281 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:18:35.540 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:18:35.540 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:18:35.540 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:18:35.540 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:18:35.540 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:35.540 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:35.799 BaseBdev2 00:18:35.799 11:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:18:35.799 11:55:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:18:35.799 11:55:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:35.799 11:55:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:35.799 11:55:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:35.799 11:55:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:35.799 11:55:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:36.059 11:55:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:36.059 [ 00:18:36.059 { 00:18:36.059 "name": "BaseBdev2", 00:18:36.059 "aliases": [ 00:18:36.059 "9081b16c-c250-4976-82a9-b1de7704c911" 00:18:36.059 ], 00:18:36.059 "product_name": "Malloc disk", 00:18:36.059 "block_size": 512, 00:18:36.059 "num_blocks": 65536, 00:18:36.059 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:36.059 "assigned_rate_limits": { 00:18:36.059 "rw_ios_per_sec": 0, 00:18:36.059 "rw_mbytes_per_sec": 0, 00:18:36.059 "r_mbytes_per_sec": 0, 00:18:36.059 "w_mbytes_per_sec": 0 00:18:36.059 }, 00:18:36.059 "claimed": false, 00:18:36.059 "zoned": false, 00:18:36.059 "supported_io_types": { 00:18:36.059 "read": true, 00:18:36.059 "write": true, 00:18:36.059 "unmap": true, 00:18:36.059 "write_zeroes": true, 00:18:36.059 "flush": true, 00:18:36.059 "reset": true, 00:18:36.059 "compare": false, 00:18:36.059 "compare_and_write": false, 00:18:36.059 "abort": true, 00:18:36.059 "nvme_admin": false, 00:18:36.059 "nvme_io": false 00:18:36.059 }, 00:18:36.059 "memory_domains": [ 00:18:36.059 { 00:18:36.059 "dma_device_id": "system", 00:18:36.059 "dma_device_type": 1 00:18:36.059 }, 00:18:36.059 { 00:18:36.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.059 "dma_device_type": 2 00:18:36.059 } 00:18:36.059 ], 00:18:36.059 "driver_specific": {} 00:18:36.059 } 00:18:36.059 ] 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:36.318 BaseBdev3 00:18:36.318 11:55:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:36.318 11:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:36.575 11:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:36.856 [ 00:18:36.856 { 00:18:36.856 "name": "BaseBdev3", 00:18:36.856 "aliases": [ 00:18:36.856 "795e62f2-867e-4f1d-af7b-406d2e2ea92a" 00:18:36.856 ], 00:18:36.856 "product_name": "Malloc disk", 00:18:36.856 "block_size": 512, 00:18:36.856 "num_blocks": 65536, 00:18:36.856 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:36.856 "assigned_rate_limits": { 00:18:36.856 "rw_ios_per_sec": 0, 00:18:36.856 "rw_mbytes_per_sec": 0, 00:18:36.856 "r_mbytes_per_sec": 0, 00:18:36.856 "w_mbytes_per_sec": 0 00:18:36.856 }, 00:18:36.856 "claimed": false, 00:18:36.856 "zoned": false, 00:18:36.856 "supported_io_types": { 00:18:36.856 "read": true, 00:18:36.856 "write": true, 00:18:36.856 "unmap": true, 00:18:36.856 "write_zeroes": true, 00:18:36.856 "flush": true, 00:18:36.856 "reset": true, 00:18:36.856 "compare": false, 00:18:36.856 "compare_and_write": false, 00:18:36.856 "abort": true, 
00:18:36.856 "nvme_admin": false, 00:18:36.856 "nvme_io": false 00:18:36.856 }, 00:18:36.856 "memory_domains": [ 00:18:36.856 { 00:18:36.856 "dma_device_id": "system", 00:18:36.856 "dma_device_type": 1 00:18:36.856 }, 00:18:36.856 { 00:18:36.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.856 "dma_device_type": 2 00:18:36.856 } 00:18:36.856 ], 00:18:36.856 "driver_specific": {} 00:18:36.856 } 00:18:36.856 ] 00:18:36.856 11:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:36.856 11:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:18:36.856 11:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:36.856 11:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:37.114 BaseBdev4 00:18:37.114 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:18:37.114 11:55:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:18:37.114 11:55:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:37.114 11:55:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:37.114 11:55:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:37.114 11:55:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:37.114 11:55:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:37.371 11:55:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:37.628 [ 00:18:37.628 { 00:18:37.628 "name": "BaseBdev4", 00:18:37.628 "aliases": [ 00:18:37.628 "f3d8922d-46fa-45fc-b7d7-58adffc74284" 00:18:37.628 ], 00:18:37.628 "product_name": "Malloc disk", 00:18:37.628 "block_size": 512, 00:18:37.628 "num_blocks": 65536, 00:18:37.628 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:37.628 "assigned_rate_limits": { 00:18:37.628 "rw_ios_per_sec": 0, 00:18:37.628 "rw_mbytes_per_sec": 0, 00:18:37.628 "r_mbytes_per_sec": 0, 00:18:37.628 "w_mbytes_per_sec": 0 00:18:37.628 }, 00:18:37.628 "claimed": false, 00:18:37.628 "zoned": false, 00:18:37.628 "supported_io_types": { 00:18:37.628 "read": true, 00:18:37.628 "write": true, 00:18:37.628 "unmap": true, 00:18:37.628 "write_zeroes": true, 00:18:37.628 "flush": true, 00:18:37.628 "reset": true, 00:18:37.628 "compare": false, 00:18:37.628 "compare_and_write": false, 00:18:37.628 "abort": true, 00:18:37.628 "nvme_admin": false, 00:18:37.628 "nvme_io": false 00:18:37.629 }, 00:18:37.629 "memory_domains": [ 00:18:37.629 { 00:18:37.629 "dma_device_id": "system", 00:18:37.629 "dma_device_type": 1 00:18:37.629 }, 00:18:37.629 { 00:18:37.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.629 "dma_device_type": 2 00:18:37.629 } 00:18:37.629 ], 00:18:37.629 "driver_specific": {} 00:18:37.629 } 00:18:37.629 ] 00:18:37.629 11:55:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:37.629 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:18:37.629 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:18:37.629 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 
BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:37.886 [2024-05-14 11:55:04.817190] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:37.886 [2024-05-14 11:55:04.817231] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:37.886 [2024-05-14 11:55:04.817250] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:37.886 [2024-05-14 11:55:04.818621] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:37.886 [2024-05-14 11:55:04.818663] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.886 11:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:38.144 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:38.144 "name": "Existed_Raid", 00:18:38.144 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:38.144 "strip_size_kb": 0, 00:18:38.144 "state": "configuring", 00:18:38.144 "raid_level": "raid1", 00:18:38.144 "superblock": true, 00:18:38.144 "num_base_bdevs": 4, 00:18:38.144 "num_base_bdevs_discovered": 3, 00:18:38.144 "num_base_bdevs_operational": 4, 00:18:38.144 "base_bdevs_list": [ 00:18:38.144 { 00:18:38.144 "name": "BaseBdev1", 00:18:38.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:38.144 "is_configured": false, 00:18:38.144 "data_offset": 0, 00:18:38.144 "data_size": 0 00:18:38.144 }, 00:18:38.144 { 00:18:38.144 "name": "BaseBdev2", 00:18:38.144 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:38.144 "is_configured": true, 00:18:38.144 "data_offset": 2048, 00:18:38.144 "data_size": 63488 00:18:38.144 }, 00:18:38.144 { 00:18:38.144 "name": "BaseBdev3", 00:18:38.144 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:38.144 "is_configured": true, 00:18:38.144 "data_offset": 2048, 00:18:38.144 "data_size": 63488 00:18:38.144 }, 00:18:38.144 { 00:18:38.144 "name": "BaseBdev4", 00:18:38.144 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:38.144 "is_configured": true, 00:18:38.144 "data_offset": 2048, 00:18:38.144 "data_size": 63488 00:18:38.144 } 00:18:38.144 ] 00:18:38.144 }' 00:18:38.144 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:38.144 11:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:38.709 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:38.967 [2024-05-14 11:55:05.823836] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.967 11:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:39.225 11:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:39.225 "name": "Existed_Raid", 00:18:39.225 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:39.225 "strip_size_kb": 0, 
00:18:39.225 "state": "configuring", 00:18:39.225 "raid_level": "raid1", 00:18:39.225 "superblock": true, 00:18:39.225 "num_base_bdevs": 4, 00:18:39.225 "num_base_bdevs_discovered": 2, 00:18:39.225 "num_base_bdevs_operational": 4, 00:18:39.225 "base_bdevs_list": [ 00:18:39.225 { 00:18:39.225 "name": "BaseBdev1", 00:18:39.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:39.225 "is_configured": false, 00:18:39.225 "data_offset": 0, 00:18:39.225 "data_size": 0 00:18:39.225 }, 00:18:39.225 { 00:18:39.225 "name": null, 00:18:39.225 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:39.225 "is_configured": false, 00:18:39.225 "data_offset": 2048, 00:18:39.225 "data_size": 63488 00:18:39.225 }, 00:18:39.225 { 00:18:39.225 "name": "BaseBdev3", 00:18:39.225 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:39.225 "is_configured": true, 00:18:39.225 "data_offset": 2048, 00:18:39.225 "data_size": 63488 00:18:39.225 }, 00:18:39.225 { 00:18:39.225 "name": "BaseBdev4", 00:18:39.225 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:39.225 "is_configured": true, 00:18:39.225 "data_offset": 2048, 00:18:39.225 "data_size": 63488 00:18:39.225 } 00:18:39.225 ] 00:18:39.225 }' 00:18:39.225 11:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:39.225 11:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:39.792 11:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.792 11:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:40.050 11:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:18:40.050 11:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:40.309 [2024-05-14 11:55:07.139849] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:40.309 BaseBdev1 00:18:40.309 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:18:40.309 11:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:18:40.309 11:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:40.309 11:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:40.309 11:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:40.309 11:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:40.309 11:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:40.567 11:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:40.567 [ 00:18:40.567 { 00:18:40.567 "name": "BaseBdev1", 00:18:40.567 "aliases": [ 00:18:40.567 "be0f8301-39c7-4988-9fe3-621769968acf" 00:18:40.567 ], 00:18:40.567 "product_name": "Malloc disk", 00:18:40.567 "block_size": 512, 00:18:40.567 "num_blocks": 65536, 00:18:40.568 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:40.568 "assigned_rate_limits": { 00:18:40.568 "rw_ios_per_sec": 0, 00:18:40.568 "rw_mbytes_per_sec": 0, 00:18:40.568 "r_mbytes_per_sec": 0, 00:18:40.568 "w_mbytes_per_sec": 0 00:18:40.568 }, 00:18:40.568 "claimed": true, 00:18:40.568 "claim_type": "exclusive_write", 
00:18:40.568 "zoned": false, 00:18:40.568 "supported_io_types": { 00:18:40.568 "read": true, 00:18:40.568 "write": true, 00:18:40.568 "unmap": true, 00:18:40.568 "write_zeroes": true, 00:18:40.568 "flush": true, 00:18:40.568 "reset": true, 00:18:40.568 "compare": false, 00:18:40.568 "compare_and_write": false, 00:18:40.568 "abort": true, 00:18:40.568 "nvme_admin": false, 00:18:40.568 "nvme_io": false 00:18:40.568 }, 00:18:40.568 "memory_domains": [ 00:18:40.568 { 00:18:40.568 "dma_device_id": "system", 00:18:40.568 "dma_device_type": 1 00:18:40.568 }, 00:18:40.568 { 00:18:40.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.568 "dma_device_type": 2 00:18:40.568 } 00:18:40.568 ], 00:18:40.568 "driver_specific": {} 00:18:40.568 } 00:18:40.568 ] 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:40.568 11:55:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.568 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:40.826 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:40.826 "name": "Existed_Raid", 00:18:40.826 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:40.826 "strip_size_kb": 0, 00:18:40.826 "state": "configuring", 00:18:40.826 "raid_level": "raid1", 00:18:40.826 "superblock": true, 00:18:40.826 "num_base_bdevs": 4, 00:18:40.826 "num_base_bdevs_discovered": 3, 00:18:40.826 "num_base_bdevs_operational": 4, 00:18:40.826 "base_bdevs_list": [ 00:18:40.826 { 00:18:40.826 "name": "BaseBdev1", 00:18:40.826 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:40.826 "is_configured": true, 00:18:40.826 "data_offset": 2048, 00:18:40.826 "data_size": 63488 00:18:40.826 }, 00:18:40.826 { 00:18:40.826 "name": null, 00:18:40.826 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:40.826 "is_configured": false, 00:18:40.826 "data_offset": 2048, 00:18:40.826 "data_size": 63488 00:18:40.826 }, 00:18:40.826 { 00:18:40.826 "name": "BaseBdev3", 00:18:40.826 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:40.827 "is_configured": true, 00:18:40.827 "data_offset": 2048, 00:18:40.827 "data_size": 63488 00:18:40.827 }, 00:18:40.827 { 00:18:40.827 "name": "BaseBdev4", 00:18:40.827 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:40.827 "is_configured": true, 00:18:40.827 "data_offset": 2048, 00:18:40.827 "data_size": 63488 00:18:40.827 } 00:18:40.827 ] 00:18:40.827 }' 00:18:40.827 11:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:40.827 11:55:07 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:41.762 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.762 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:41.762 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:18:41.762 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:42.021 [2024-05-14 11:55:08.876485] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:42.021 11:55:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.021 11:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:42.281 11:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:42.281 "name": "Existed_Raid", 00:18:42.281 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:42.281 "strip_size_kb": 0, 00:18:42.281 "state": "configuring", 00:18:42.281 "raid_level": "raid1", 00:18:42.281 "superblock": true, 00:18:42.281 "num_base_bdevs": 4, 00:18:42.281 "num_base_bdevs_discovered": 2, 00:18:42.281 "num_base_bdevs_operational": 4, 00:18:42.281 "base_bdevs_list": [ 00:18:42.281 { 00:18:42.281 "name": "BaseBdev1", 00:18:42.281 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:42.281 "is_configured": true, 00:18:42.281 "data_offset": 2048, 00:18:42.281 "data_size": 63488 00:18:42.281 }, 00:18:42.281 { 00:18:42.281 "name": null, 00:18:42.281 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:42.281 "is_configured": false, 00:18:42.281 "data_offset": 2048, 00:18:42.281 "data_size": 63488 00:18:42.281 }, 00:18:42.281 { 00:18:42.281 "name": null, 00:18:42.281 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:42.281 "is_configured": false, 00:18:42.281 "data_offset": 2048, 00:18:42.281 "data_size": 63488 00:18:42.281 }, 00:18:42.281 { 00:18:42.281 "name": "BaseBdev4", 00:18:42.281 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:42.281 "is_configured": true, 00:18:42.281 "data_offset": 2048, 00:18:42.281 "data_size": 63488 00:18:42.281 } 00:18:42.281 ] 00:18:42.281 }' 00:18:42.281 11:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:42.281 11:55:09 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:42.847 11:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:42.847 11:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.107 11:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:18:43.107 11:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:43.107 [2024-05-14 11:55:10.183962] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:43.367 11:55:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.367 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.626 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:43.626 "name": "Existed_Raid", 00:18:43.626 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:43.626 "strip_size_kb": 0, 00:18:43.626 "state": "configuring", 00:18:43.626 "raid_level": "raid1", 00:18:43.626 "superblock": true, 00:18:43.626 "num_base_bdevs": 4, 00:18:43.626 "num_base_bdevs_discovered": 3, 00:18:43.626 "num_base_bdevs_operational": 4, 00:18:43.626 "base_bdevs_list": [ 00:18:43.626 { 00:18:43.626 "name": "BaseBdev1", 00:18:43.626 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:43.626 "is_configured": true, 00:18:43.626 "data_offset": 2048, 00:18:43.626 "data_size": 63488 00:18:43.626 }, 00:18:43.626 { 00:18:43.626 "name": null, 00:18:43.626 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:43.626 "is_configured": false, 00:18:43.626 "data_offset": 2048, 00:18:43.626 "data_size": 63488 00:18:43.626 }, 00:18:43.626 { 00:18:43.627 "name": "BaseBdev3", 00:18:43.627 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:43.627 "is_configured": true, 00:18:43.627 "data_offset": 2048, 00:18:43.627 "data_size": 63488 00:18:43.627 }, 00:18:43.627 { 00:18:43.627 "name": "BaseBdev4", 00:18:43.627 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:43.627 "is_configured": true, 00:18:43.627 "data_offset": 2048, 00:18:43.627 "data_size": 63488 00:18:43.627 } 00:18:43.627 ] 00:18:43.627 }' 00:18:43.627 11:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:43.627 11:55:10 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:44.195 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.195 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:44.454 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:18:44.454 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:44.454 [2024-05-14 11:55:11.515512] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@125 -- # local tmp 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:44.713 "name": "Existed_Raid", 00:18:44.713 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:44.713 "strip_size_kb": 0, 00:18:44.713 "state": "configuring", 00:18:44.713 "raid_level": "raid1", 00:18:44.713 "superblock": true, 00:18:44.713 "num_base_bdevs": 4, 00:18:44.713 "num_base_bdevs_discovered": 2, 00:18:44.713 "num_base_bdevs_operational": 4, 00:18:44.713 "base_bdevs_list": [ 00:18:44.713 { 00:18:44.713 "name": null, 00:18:44.713 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:44.713 "is_configured": false, 00:18:44.713 "data_offset": 2048, 00:18:44.713 "data_size": 63488 00:18:44.713 }, 00:18:44.713 { 00:18:44.713 "name": null, 00:18:44.713 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:44.713 "is_configured": false, 00:18:44.713 "data_offset": 2048, 00:18:44.713 "data_size": 63488 00:18:44.713 }, 00:18:44.713 { 00:18:44.713 "name": "BaseBdev3", 00:18:44.713 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:44.713 "is_configured": true, 00:18:44.713 "data_offset": 2048, 00:18:44.713 "data_size": 63488 00:18:44.713 }, 00:18:44.713 { 00:18:44.713 "name": "BaseBdev4", 00:18:44.713 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:44.713 "is_configured": true, 00:18:44.713 "data_offset": 2048, 00:18:44.713 "data_size": 63488 00:18:44.713 } 00:18:44.713 ] 00:18:44.713 }' 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:44.713 11:55:11 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:45.651 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.651 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:45.651 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:18:45.651 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:45.911 [2024-05-14 11:55:12.781478] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@125 -- # local tmp 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.911 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:46.170 11:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:46.170 "name": "Existed_Raid", 00:18:46.170 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:46.170 "strip_size_kb": 0, 00:18:46.170 "state": "configuring", 00:18:46.170 "raid_level": "raid1", 00:18:46.170 "superblock": true, 00:18:46.170 "num_base_bdevs": 4, 00:18:46.170 "num_base_bdevs_discovered": 3, 00:18:46.170 "num_base_bdevs_operational": 4, 00:18:46.170 "base_bdevs_list": [ 00:18:46.170 { 00:18:46.170 "name": null, 00:18:46.170 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:46.170 "is_configured": false, 00:18:46.170 "data_offset": 2048, 00:18:46.170 "data_size": 63488 00:18:46.170 }, 00:18:46.170 { 00:18:46.170 "name": "BaseBdev2", 00:18:46.170 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:46.170 "is_configured": true, 00:18:46.170 "data_offset": 2048, 00:18:46.170 "data_size": 63488 00:18:46.170 }, 00:18:46.170 { 00:18:46.170 "name": "BaseBdev3", 00:18:46.170 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:46.170 "is_configured": true, 00:18:46.170 "data_offset": 2048, 00:18:46.170 "data_size": 63488 00:18:46.170 }, 00:18:46.170 { 00:18:46.170 "name": "BaseBdev4", 00:18:46.170 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:46.170 "is_configured": true, 00:18:46.170 "data_offset": 2048, 00:18:46.170 "data_size": 63488 00:18:46.170 } 00:18:46.170 ] 00:18:46.170 }' 00:18:46.170 11:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:46.170 11:55:13 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:46.742 11:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:46.742 11:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.742 11:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:18:46.742 11:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:46.742 11:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.002 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u be0f8301-39c7-4988-9fe3-621769968acf 00:18:47.261 [2024-05-14 11:55:14.253916] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:47.261 [2024-05-14 11:55:14.254082] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x9d5920 00:18:47.261 [2024-05-14 11:55:14.254094] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:47.261 [2024-05-14 11:55:14.254285] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x73ddf0 00:18:47.261 [2024-05-14 11:55:14.254427] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9d5920 00:18:47.261 [2024-05-14 11:55:14.254438] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9d5920 00:18:47.261 [2024-05-14 11:55:14.254534] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:47.261 NewBaseBdev 
00:18:47.261 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:18:47.261 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:18:47.261 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:18:47.261 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:18:47.261 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:18:47.261 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:18:47.261 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:47.520 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:47.780 [ 00:18:47.780 { 00:18:47.780 "name": "NewBaseBdev", 00:18:47.780 "aliases": [ 00:18:47.780 "be0f8301-39c7-4988-9fe3-621769968acf" 00:18:47.780 ], 00:18:47.780 "product_name": "Malloc disk", 00:18:47.780 "block_size": 512, 00:18:47.780 "num_blocks": 65536, 00:18:47.780 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:47.780 "assigned_rate_limits": { 00:18:47.780 "rw_ios_per_sec": 0, 00:18:47.780 "rw_mbytes_per_sec": 0, 00:18:47.780 "r_mbytes_per_sec": 0, 00:18:47.780 "w_mbytes_per_sec": 0 00:18:47.780 }, 00:18:47.780 "claimed": true, 00:18:47.780 "claim_type": "exclusive_write", 00:18:47.780 "zoned": false, 00:18:47.780 "supported_io_types": { 00:18:47.780 "read": true, 00:18:47.780 "write": true, 00:18:47.780 "unmap": true, 00:18:47.780 "write_zeroes": true, 00:18:47.780 "flush": true, 00:18:47.780 "reset": true, 00:18:47.780 "compare": 
false, 00:18:47.780 "compare_and_write": false, 00:18:47.780 "abort": true, 00:18:47.780 "nvme_admin": false, 00:18:47.780 "nvme_io": false 00:18:47.780 }, 00:18:47.780 "memory_domains": [ 00:18:47.780 { 00:18:47.780 "dma_device_id": "system", 00:18:47.780 "dma_device_type": 1 00:18:47.780 }, 00:18:47.780 { 00:18:47.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.780 "dma_device_type": 2 00:18:47.780 } 00:18:47.780 ], 00:18:47.780 "driver_specific": {} 00:18:47.780 } 00:18:47.780 ] 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:47.780 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.780 
11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:48.039 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:48.039 "name": "Existed_Raid", 00:18:48.039 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:48.039 "strip_size_kb": 0, 00:18:48.039 "state": "online", 00:18:48.039 "raid_level": "raid1", 00:18:48.039 "superblock": true, 00:18:48.039 "num_base_bdevs": 4, 00:18:48.039 "num_base_bdevs_discovered": 4, 00:18:48.039 "num_base_bdevs_operational": 4, 00:18:48.039 "base_bdevs_list": [ 00:18:48.039 { 00:18:48.039 "name": "NewBaseBdev", 00:18:48.039 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:48.039 "is_configured": true, 00:18:48.039 "data_offset": 2048, 00:18:48.039 "data_size": 63488 00:18:48.039 }, 00:18:48.039 { 00:18:48.039 "name": "BaseBdev2", 00:18:48.039 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:48.039 "is_configured": true, 00:18:48.039 "data_offset": 2048, 00:18:48.039 "data_size": 63488 00:18:48.039 }, 00:18:48.039 { 00:18:48.039 "name": "BaseBdev3", 00:18:48.039 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:48.039 "is_configured": true, 00:18:48.039 "data_offset": 2048, 00:18:48.039 "data_size": 63488 00:18:48.039 }, 00:18:48.039 { 00:18:48.039 "name": "BaseBdev4", 00:18:48.039 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:48.039 "is_configured": true, 00:18:48.039 "data_offset": 2048, 00:18:48.039 "data_size": 63488 00:18:48.039 } 00:18:48.039 ] 00:18:48.039 }' 00:18:48.039 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:48.039 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # 
local raid_bdev_name=Existed_Raid 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:18:48.608 [2024-05-14 11:55:15.629891] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:18:48.608 "name": "Existed_Raid", 00:18:48.608 "aliases": [ 00:18:48.608 "7687af3b-d0ef-4028-9b8a-5adc962d9561" 00:18:48.608 ], 00:18:48.608 "product_name": "Raid Volume", 00:18:48.608 "block_size": 512, 00:18:48.608 "num_blocks": 63488, 00:18:48.608 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:48.608 "assigned_rate_limits": { 00:18:48.608 "rw_ios_per_sec": 0, 00:18:48.608 "rw_mbytes_per_sec": 0, 00:18:48.608 "r_mbytes_per_sec": 0, 00:18:48.608 "w_mbytes_per_sec": 0 00:18:48.608 }, 00:18:48.608 "claimed": false, 00:18:48.608 "zoned": false, 00:18:48.608 "supported_io_types": { 00:18:48.608 "read": true, 00:18:48.608 "write": true, 00:18:48.608 "unmap": false, 00:18:48.608 "write_zeroes": true, 00:18:48.608 "flush": false, 00:18:48.608 "reset": true, 00:18:48.608 "compare": false, 00:18:48.608 "compare_and_write": false, 00:18:48.608 "abort": false, 00:18:48.608 "nvme_admin": false, 00:18:48.608 "nvme_io": false 00:18:48.608 }, 00:18:48.608 "memory_domains": [ 
00:18:48.608 { 00:18:48.608 "dma_device_id": "system", 00:18:48.608 "dma_device_type": 1 00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.608 "dma_device_type": 2 00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "dma_device_id": "system", 00:18:48.608 "dma_device_type": 1 00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.608 "dma_device_type": 2 00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "dma_device_id": "system", 00:18:48.608 "dma_device_type": 1 00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.608 "dma_device_type": 2 00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "dma_device_id": "system", 00:18:48.608 "dma_device_type": 1 00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.608 "dma_device_type": 2 00:18:48.608 } 00:18:48.608 ], 00:18:48.608 "driver_specific": { 00:18:48.608 "raid": { 00:18:48.608 "uuid": "7687af3b-d0ef-4028-9b8a-5adc962d9561", 00:18:48.608 "strip_size_kb": 0, 00:18:48.608 "state": "online", 00:18:48.608 "raid_level": "raid1", 00:18:48.608 "superblock": true, 00:18:48.608 "num_base_bdevs": 4, 00:18:48.608 "num_base_bdevs_discovered": 4, 00:18:48.608 "num_base_bdevs_operational": 4, 00:18:48.608 "base_bdevs_list": [ 00:18:48.608 { 00:18:48.608 "name": "NewBaseBdev", 00:18:48.608 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:48.608 "is_configured": true, 00:18:48.608 "data_offset": 2048, 00:18:48.608 "data_size": 63488 00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "name": "BaseBdev2", 00:18:48.608 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:48.608 "is_configured": true, 00:18:48.608 "data_offset": 2048, 00:18:48.608 "data_size": 63488 00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "name": "BaseBdev3", 00:18:48.608 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:48.608 "is_configured": true, 00:18:48.608 "data_offset": 2048, 00:18:48.608 "data_size": 63488 
00:18:48.608 }, 00:18:48.608 { 00:18:48.608 "name": "BaseBdev4", 00:18:48.608 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:48.608 "is_configured": true, 00:18:48.608 "data_offset": 2048, 00:18:48.608 "data_size": 63488 00:18:48.608 } 00:18:48.608 ] 00:18:48.608 } 00:18:48.608 } 00:18:48.608 }' 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:18:48.608 BaseBdev2 00:18:48.608 BaseBdev3 00:18:48.608 BaseBdev4' 00:18:48.608 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:48.867 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:48.867 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:48.867 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:48.867 "name": "NewBaseBdev", 00:18:48.867 "aliases": [ 00:18:48.867 "be0f8301-39c7-4988-9fe3-621769968acf" 00:18:48.867 ], 00:18:48.867 "product_name": "Malloc disk", 00:18:48.867 "block_size": 512, 00:18:48.867 "num_blocks": 65536, 00:18:48.867 "uuid": "be0f8301-39c7-4988-9fe3-621769968acf", 00:18:48.867 "assigned_rate_limits": { 00:18:48.867 "rw_ios_per_sec": 0, 00:18:48.867 "rw_mbytes_per_sec": 0, 00:18:48.867 "r_mbytes_per_sec": 0, 00:18:48.867 "w_mbytes_per_sec": 0 00:18:48.867 }, 00:18:48.867 "claimed": true, 00:18:48.867 "claim_type": "exclusive_write", 00:18:48.867 "zoned": false, 00:18:48.867 "supported_io_types": { 00:18:48.867 "read": true, 00:18:48.867 "write": true, 00:18:48.867 "unmap": true, 00:18:48.867 "write_zeroes": true, 00:18:48.867 "flush": true, 
00:18:48.867 "reset": true, 00:18:48.867 "compare": false, 00:18:48.867 "compare_and_write": false, 00:18:48.867 "abort": true, 00:18:48.867 "nvme_admin": false, 00:18:48.867 "nvme_io": false 00:18:48.867 }, 00:18:48.867 "memory_domains": [ 00:18:48.867 { 00:18:48.867 "dma_device_id": "system", 00:18:48.867 "dma_device_type": 1 00:18:48.867 }, 00:18:48.867 { 00:18:48.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.867 "dma_device_type": 2 00:18:48.867 } 00:18:48.867 ], 00:18:48.867 "driver_specific": {} 00:18:48.867 }' 00:18:48.867 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:49.126 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:49.126 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:49.126 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:49.126 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:49.126 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:49.126 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:49.126 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:49.385 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:49.385 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:49.385 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:49.385 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:49.385 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:49.385 11:55:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:49.385 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:49.644 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:49.644 "name": "BaseBdev2", 00:18:49.644 "aliases": [ 00:18:49.644 "9081b16c-c250-4976-82a9-b1de7704c911" 00:18:49.644 ], 00:18:49.644 "product_name": "Malloc disk", 00:18:49.644 "block_size": 512, 00:18:49.644 "num_blocks": 65536, 00:18:49.644 "uuid": "9081b16c-c250-4976-82a9-b1de7704c911", 00:18:49.644 "assigned_rate_limits": { 00:18:49.644 "rw_ios_per_sec": 0, 00:18:49.644 "rw_mbytes_per_sec": 0, 00:18:49.644 "r_mbytes_per_sec": 0, 00:18:49.644 "w_mbytes_per_sec": 0 00:18:49.644 }, 00:18:49.644 "claimed": true, 00:18:49.644 "claim_type": "exclusive_write", 00:18:49.644 "zoned": false, 00:18:49.644 "supported_io_types": { 00:18:49.644 "read": true, 00:18:49.644 "write": true, 00:18:49.644 "unmap": true, 00:18:49.644 "write_zeroes": true, 00:18:49.644 "flush": true, 00:18:49.644 "reset": true, 00:18:49.644 "compare": false, 00:18:49.644 "compare_and_write": false, 00:18:49.644 "abort": true, 00:18:49.644 "nvme_admin": false, 00:18:49.644 "nvme_io": false 00:18:49.644 }, 00:18:49.644 "memory_domains": [ 00:18:49.644 { 00:18:49.644 "dma_device_id": "system", 00:18:49.644 "dma_device_type": 1 00:18:49.644 }, 00:18:49.644 { 00:18:49.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:49.644 "dma_device_type": 2 00:18:49.644 } 00:18:49.644 ], 00:18:49.644 "driver_specific": {} 00:18:49.644 }' 00:18:49.644 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:49.644 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:49.644 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:49.644 
11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:49.644 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:49.644 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:49.644 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:49.903 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:49.903 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:49.903 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:49.903 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:49.903 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:49.903 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:49.903 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:49.903 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:50.163 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:50.163 "name": "BaseBdev3", 00:18:50.163 "aliases": [ 00:18:50.163 "795e62f2-867e-4f1d-af7b-406d2e2ea92a" 00:18:50.163 ], 00:18:50.163 "product_name": "Malloc disk", 00:18:50.163 "block_size": 512, 00:18:50.163 "num_blocks": 65536, 00:18:50.163 "uuid": "795e62f2-867e-4f1d-af7b-406d2e2ea92a", 00:18:50.163 "assigned_rate_limits": { 00:18:50.163 "rw_ios_per_sec": 0, 00:18:50.163 "rw_mbytes_per_sec": 0, 00:18:50.163 "r_mbytes_per_sec": 0, 00:18:50.163 "w_mbytes_per_sec": 0 00:18:50.163 }, 00:18:50.163 "claimed": true, 
00:18:50.163 "claim_type": "exclusive_write", 00:18:50.163 "zoned": false, 00:18:50.163 "supported_io_types": { 00:18:50.163 "read": true, 00:18:50.163 "write": true, 00:18:50.163 "unmap": true, 00:18:50.163 "write_zeroes": true, 00:18:50.163 "flush": true, 00:18:50.163 "reset": true, 00:18:50.163 "compare": false, 00:18:50.163 "compare_and_write": false, 00:18:50.163 "abort": true, 00:18:50.163 "nvme_admin": false, 00:18:50.163 "nvme_io": false 00:18:50.163 }, 00:18:50.163 "memory_domains": [ 00:18:50.163 { 00:18:50.163 "dma_device_id": "system", 00:18:50.163 "dma_device_type": 1 00:18:50.163 }, 00:18:50.163 { 00:18:50.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.163 "dma_device_type": 2 00:18:50.163 } 00:18:50.163 ], 00:18:50.163 "driver_specific": {} 00:18:50.163 }' 00:18:50.163 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:50.163 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:50.163 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:50.163 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:50.163 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:50.163 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:50.163 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:50.163 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:50.423 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:50.423 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:50.423 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:50.423 11:55:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:50.423 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:50.423 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:50.423 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:50.689 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:50.689 "name": "BaseBdev4", 00:18:50.689 "aliases": [ 00:18:50.689 "f3d8922d-46fa-45fc-b7d7-58adffc74284" 00:18:50.689 ], 00:18:50.689 "product_name": "Malloc disk", 00:18:50.689 "block_size": 512, 00:18:50.689 "num_blocks": 65536, 00:18:50.689 "uuid": "f3d8922d-46fa-45fc-b7d7-58adffc74284", 00:18:50.689 "assigned_rate_limits": { 00:18:50.689 "rw_ios_per_sec": 0, 00:18:50.689 "rw_mbytes_per_sec": 0, 00:18:50.689 "r_mbytes_per_sec": 0, 00:18:50.689 "w_mbytes_per_sec": 0 00:18:50.690 }, 00:18:50.690 "claimed": true, 00:18:50.690 "claim_type": "exclusive_write", 00:18:50.690 "zoned": false, 00:18:50.690 "supported_io_types": { 00:18:50.690 "read": true, 00:18:50.690 "write": true, 00:18:50.690 "unmap": true, 00:18:50.690 "write_zeroes": true, 00:18:50.690 "flush": true, 00:18:50.690 "reset": true, 00:18:50.690 "compare": false, 00:18:50.690 "compare_and_write": false, 00:18:50.690 "abort": true, 00:18:50.690 "nvme_admin": false, 00:18:50.690 "nvme_io": false 00:18:50.690 }, 00:18:50.690 "memory_domains": [ 00:18:50.690 { 00:18:50.690 "dma_device_id": "system", 00:18:50.690 "dma_device_type": 1 00:18:50.690 }, 00:18:50.690 { 00:18:50.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.690 "dma_device_type": 2 00:18:50.690 } 00:18:50.690 ], 00:18:50.690 "driver_specific": {} 00:18:50.690 }' 00:18:50.690 11:55:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:50.690 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:50.690 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:50.690 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:50.690 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:50.690 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:50.690 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:50.956 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:50.956 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:50.956 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:50.956 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:50.956 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:50.956 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:51.249 [2024-05-14 11:55:18.132309] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:51.249 [2024-05-14 11:55:18.132345] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:51.249 [2024-05-14 11:55:18.132421] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:51.249 [2024-05-14 11:55:18.132729] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:51.249 [2024-05-14 11:55:18.132743] bdev_raid.c: 
350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9d5920 name Existed_Raid, state offline 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 1741593 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1741593 ']' 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 1741593 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1741593 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1741593' 00:18:51.249 killing process with pid 1741593 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 1741593 00:18:51.249 [2024-05-14 11:55:18.203470] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:51.249 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 1741593 00:18:51.249 [2024-05-14 11:55:18.280499] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:51.819 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:18:51.819 00:18:51.819 real 0m31.260s 00:18:51.819 user 0m57.215s 00:18:51.819 sys 0m5.475s 00:18:51.819 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:18:51.819 11:55:18 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:51.819 ************************************ 00:18:51.819 END TEST raid_state_function_test_sb 00:18:51.819 ************************************ 00:18:51.819 11:55:18 bdev_raid -- bdev/bdev_raid.sh@817 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:18:51.819 11:55:18 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:18:51.819 11:55:18 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:18:51.819 11:55:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:51.819 ************************************ 00:18:51.819 START TEST raid_superblock_test 00:18:51.819 ************************************ 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 4 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=4 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local strip_size_create_arg 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=1746308 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 1746308 /var/tmp/spdk-raid.sock 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 1746308 ']' 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:51.819 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:18:51.819 11:55:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.819 [2024-05-14 11:55:18.834737] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:18:51.819 [2024-05-14 11:55:18.834801] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1746308 ] 00:18:52.078 [2024-05-14 11:55:18.962261] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:52.078 [2024-05-14 11:55:19.067170] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:52.078 [2024-05-14 11:55:19.137738] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:52.078 [2024-05-14 11:55:19.137774] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:53.016 11:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:18:53.016 malloc1 00:18:53.016 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:53.276 [2024-05-14 11:55:20.180004] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:53.276 [2024-05-14 11:55:20.180053] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:53.276 [2024-05-14 11:55:20.180075] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24132a0 00:18:53.276 [2024-05-14 11:55:20.180088] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:53.276 [2024-05-14 11:55:20.181817] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:53.276 [2024-05-14 11:55:20.181846] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:53.276 pt1 00:18:53.276 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:18:53.276 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:53.276 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:18:53.276 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:18:53.276 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:53.276 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:53.276 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:18:53.276 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:53.276 11:55:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:53.535 malloc2 00:18:53.535 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:53.795 [2024-05-14 11:55:20.682243] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:53.795 [2024-05-14 11:55:20.682288] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:53.795 [2024-05-14 11:55:20.682312] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c6480 00:18:53.795 [2024-05-14 11:55:20.682324] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:53.795 [2024-05-14 11:55:20.683926] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:53.795 [2024-05-14 11:55:20.683954] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:53.795 pt2 00:18:53.795 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:18:53.795 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:53.795 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:18:53.795 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:18:53.795 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:53.795 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:53.795 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:18:53.795 11:55:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:53.795 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:54.055 malloc3 00:18:54.055 11:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:54.314 [2024-05-14 11:55:21.165013] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:54.314 [2024-05-14 11:55:21.165057] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:54.314 [2024-05-14 11:55:21.165077] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240ce80 00:18:54.314 [2024-05-14 11:55:21.165089] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:54.314 [2024-05-14 11:55:21.166653] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:54.314 [2024-05-14 11:55:21.166681] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:54.314 pt3 00:18:54.314 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:18:54.314 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:54.314 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc4 00:18:54.314 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt4 00:18:54.314 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:54.314 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:54.314 11:55:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:18:54.314 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:54.314 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:54.574 malloc4 00:18:54.574 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:54.833 [2024-05-14 11:55:21.660143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:54.833 [2024-05-14 11:55:21.660190] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:54.833 [2024-05-14 11:55:21.660211] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240f490 00:18:54.833 [2024-05-14 11:55:21.660224] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:54.833 [2024-05-14 11:55:21.661803] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:54.833 [2024-05-14 11:55:21.661831] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:54.833 pt4 00:18:54.833 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:18:54.833 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:18:54.833 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:54.833 [2024-05-14 11:55:21.900803] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:54.833 
[2024-05-14 11:55:21.902127] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:54.833 [2024-05-14 11:55:21.902181] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:54.833 [2024-05-14 11:55:21.902226] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:54.833 [2024-05-14 11:55:21.902419] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x24107a0 00:18:54.833 [2024-05-14 11:55:21.902431] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:54.833 [2024-05-14 11:55:21.902625] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2410770 00:18:54.833 [2024-05-14 11:55:21.902778] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24107a0 00:18:54.833 [2024-05-14 11:55:21.902789] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24107a0 00:18:54.833 [2024-05-14 11:55:21.902888] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:54.833 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:54.833 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 
00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.093 11:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:55.093 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:55.093 "name": "raid_bdev1", 00:18:55.093 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:18:55.093 "strip_size_kb": 0, 00:18:55.093 "state": "online", 00:18:55.093 "raid_level": "raid1", 00:18:55.093 "superblock": true, 00:18:55.093 "num_base_bdevs": 4, 00:18:55.093 "num_base_bdevs_discovered": 4, 00:18:55.093 "num_base_bdevs_operational": 4, 00:18:55.093 "base_bdevs_list": [ 00:18:55.093 { 00:18:55.093 "name": "pt1", 00:18:55.093 "uuid": "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a", 00:18:55.093 "is_configured": true, 00:18:55.093 "data_offset": 2048, 00:18:55.093 "data_size": 63488 00:18:55.094 }, 00:18:55.094 { 00:18:55.094 "name": "pt2", 00:18:55.094 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:18:55.094 "is_configured": true, 00:18:55.094 "data_offset": 2048, 00:18:55.094 "data_size": 63488 00:18:55.094 }, 00:18:55.094 { 00:18:55.094 "name": "pt3", 00:18:55.094 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:18:55.094 "is_configured": true, 00:18:55.094 "data_offset": 2048, 00:18:55.094 "data_size": 63488 00:18:55.094 }, 00:18:55.094 { 00:18:55.094 "name": "pt4", 00:18:55.094 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:18:55.094 "is_configured": true, 00:18:55.094 "data_offset": 2048, 00:18:55.094 "data_size": 63488 00:18:55.094 } 00:18:55.094 ] 00:18:55.094 }' 00:18:55.094 11:55:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:55.094 11:55:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.031 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:18:56.031 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:18:56.031 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:18:56.031 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:18:56.031 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:18:56.031 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:18:56.031 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:56.031 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:18:56.031 [2024-05-14 11:55:22.971878] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:56.031 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:18:56.031 "name": "raid_bdev1", 00:18:56.031 "aliases": [ 00:18:56.031 "932092f3-27e1-487f-9e4d-d7d64c63a0ca" 00:18:56.031 ], 00:18:56.031 "product_name": "Raid Volume", 00:18:56.031 "block_size": 512, 00:18:56.031 "num_blocks": 63488, 00:18:56.031 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:18:56.031 "assigned_rate_limits": { 00:18:56.031 "rw_ios_per_sec": 0, 00:18:56.031 "rw_mbytes_per_sec": 0, 00:18:56.031 "r_mbytes_per_sec": 0, 00:18:56.031 "w_mbytes_per_sec": 0 00:18:56.031 }, 00:18:56.031 "claimed": false, 00:18:56.031 "zoned": false, 00:18:56.031 "supported_io_types": { 00:18:56.031 "read": true, 00:18:56.031 "write": true, 00:18:56.031 "unmap": false, 00:18:56.031 
"write_zeroes": true, 00:18:56.031 "flush": false, 00:18:56.031 "reset": true, 00:18:56.031 "compare": false, 00:18:56.031 "compare_and_write": false, 00:18:56.031 "abort": false, 00:18:56.031 "nvme_admin": false, 00:18:56.031 "nvme_io": false 00:18:56.031 }, 00:18:56.031 "memory_domains": [ 00:18:56.031 { 00:18:56.031 "dma_device_id": "system", 00:18:56.031 "dma_device_type": 1 00:18:56.031 }, 00:18:56.031 { 00:18:56.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.031 "dma_device_type": 2 00:18:56.031 }, 00:18:56.031 { 00:18:56.031 "dma_device_id": "system", 00:18:56.031 "dma_device_type": 1 00:18:56.031 }, 00:18:56.031 { 00:18:56.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.031 "dma_device_type": 2 00:18:56.031 }, 00:18:56.031 { 00:18:56.031 "dma_device_id": "system", 00:18:56.031 "dma_device_type": 1 00:18:56.031 }, 00:18:56.031 { 00:18:56.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.031 "dma_device_type": 2 00:18:56.031 }, 00:18:56.031 { 00:18:56.031 "dma_device_id": "system", 00:18:56.031 "dma_device_type": 1 00:18:56.031 }, 00:18:56.031 { 00:18:56.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.031 "dma_device_type": 2 00:18:56.031 } 00:18:56.031 ], 00:18:56.031 "driver_specific": { 00:18:56.031 "raid": { 00:18:56.031 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:18:56.031 "strip_size_kb": 0, 00:18:56.032 "state": "online", 00:18:56.032 "raid_level": "raid1", 00:18:56.032 "superblock": true, 00:18:56.032 "num_base_bdevs": 4, 00:18:56.032 "num_base_bdevs_discovered": 4, 00:18:56.032 "num_base_bdevs_operational": 4, 00:18:56.032 "base_bdevs_list": [ 00:18:56.032 { 00:18:56.032 "name": "pt1", 00:18:56.032 "uuid": "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a", 00:18:56.032 "is_configured": true, 00:18:56.032 "data_offset": 2048, 00:18:56.032 "data_size": 63488 00:18:56.032 }, 00:18:56.032 { 00:18:56.032 "name": "pt2", 00:18:56.032 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:18:56.032 "is_configured": true, 00:18:56.032 
"data_offset": 2048, 00:18:56.032 "data_size": 63488 00:18:56.032 }, 00:18:56.032 { 00:18:56.032 "name": "pt3", 00:18:56.032 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:18:56.032 "is_configured": true, 00:18:56.032 "data_offset": 2048, 00:18:56.032 "data_size": 63488 00:18:56.032 }, 00:18:56.032 { 00:18:56.032 "name": "pt4", 00:18:56.032 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:18:56.032 "is_configured": true, 00:18:56.032 "data_offset": 2048, 00:18:56.032 "data_size": 63488 00:18:56.032 } 00:18:56.032 ] 00:18:56.032 } 00:18:56.032 } 00:18:56.032 }' 00:18:56.032 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:56.032 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:18:56.032 pt2 00:18:56.032 pt3 00:18:56.032 pt4' 00:18:56.032 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:56.032 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:56.032 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:56.291 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:56.291 "name": "pt1", 00:18:56.291 "aliases": [ 00:18:56.291 "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a" 00:18:56.291 ], 00:18:56.291 "product_name": "passthru", 00:18:56.291 "block_size": 512, 00:18:56.291 "num_blocks": 65536, 00:18:56.291 "uuid": "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a", 00:18:56.291 "assigned_rate_limits": { 00:18:56.291 "rw_ios_per_sec": 0, 00:18:56.291 "rw_mbytes_per_sec": 0, 00:18:56.291 "r_mbytes_per_sec": 0, 00:18:56.291 "w_mbytes_per_sec": 0 00:18:56.291 }, 00:18:56.291 "claimed": true, 00:18:56.291 "claim_type": "exclusive_write", 00:18:56.291 "zoned": false, 
00:18:56.291 "supported_io_types": { 00:18:56.291 "read": true, 00:18:56.291 "write": true, 00:18:56.291 "unmap": true, 00:18:56.291 "write_zeroes": true, 00:18:56.291 "flush": true, 00:18:56.291 "reset": true, 00:18:56.291 "compare": false, 00:18:56.291 "compare_and_write": false, 00:18:56.291 "abort": true, 00:18:56.291 "nvme_admin": false, 00:18:56.291 "nvme_io": false 00:18:56.291 }, 00:18:56.291 "memory_domains": [ 00:18:56.291 { 00:18:56.291 "dma_device_id": "system", 00:18:56.291 "dma_device_type": 1 00:18:56.291 }, 00:18:56.291 { 00:18:56.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.291 "dma_device_type": 2 00:18:56.291 } 00:18:56.291 ], 00:18:56.291 "driver_specific": { 00:18:56.291 "passthru": { 00:18:56.291 "name": "pt1", 00:18:56.291 "base_bdev_name": "malloc1" 00:18:56.291 } 00:18:56.291 } 00:18:56.291 }' 00:18:56.291 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:56.291 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:56.291 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:56.291 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:56.551 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:56.551 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:56.551 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:56.551 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:56.551 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:56.551 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:56.551 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:56.812 11:55:23 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:56.812 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:56.812 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:56.812 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:57.071 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:57.071 "name": "pt2", 00:18:57.071 "aliases": [ 00:18:57.071 "cf736448-5377-50de-ac96-a57a5d657c4a" 00:18:57.071 ], 00:18:57.071 "product_name": "passthru", 00:18:57.071 "block_size": 512, 00:18:57.071 "num_blocks": 65536, 00:18:57.071 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:18:57.071 "assigned_rate_limits": { 00:18:57.071 "rw_ios_per_sec": 0, 00:18:57.071 "rw_mbytes_per_sec": 0, 00:18:57.071 "r_mbytes_per_sec": 0, 00:18:57.071 "w_mbytes_per_sec": 0 00:18:57.071 }, 00:18:57.071 "claimed": true, 00:18:57.071 "claim_type": "exclusive_write", 00:18:57.071 "zoned": false, 00:18:57.071 "supported_io_types": { 00:18:57.071 "read": true, 00:18:57.071 "write": true, 00:18:57.071 "unmap": true, 00:18:57.071 "write_zeroes": true, 00:18:57.071 "flush": true, 00:18:57.071 "reset": true, 00:18:57.071 "compare": false, 00:18:57.071 "compare_and_write": false, 00:18:57.071 "abort": true, 00:18:57.071 "nvme_admin": false, 00:18:57.071 "nvme_io": false 00:18:57.071 }, 00:18:57.071 "memory_domains": [ 00:18:57.071 { 00:18:57.071 "dma_device_id": "system", 00:18:57.071 "dma_device_type": 1 00:18:57.071 }, 00:18:57.071 { 00:18:57.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.071 "dma_device_type": 2 00:18:57.071 } 00:18:57.071 ], 00:18:57.071 "driver_specific": { 00:18:57.071 "passthru": { 00:18:57.071 "name": "pt2", 00:18:57.071 "base_bdev_name": "malloc2" 00:18:57.071 } 00:18:57.071 } 00:18:57.071 }' 00:18:57.071 11:55:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:57.071 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:57.071 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:57.071 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:57.071 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:57.071 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:57.071 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:57.071 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:57.071 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:57.330 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:57.330 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:57.330 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:57.330 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:57.330 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:57.330 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:57.588 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:57.588 "name": "pt3", 00:18:57.588 "aliases": [ 00:18:57.588 "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c" 00:18:57.588 ], 00:18:57.588 "product_name": "passthru", 00:18:57.588 "block_size": 512, 00:18:57.588 "num_blocks": 65536, 00:18:57.588 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:18:57.588 "assigned_rate_limits": { 
00:18:57.588 "rw_ios_per_sec": 0, 00:18:57.588 "rw_mbytes_per_sec": 0, 00:18:57.588 "r_mbytes_per_sec": 0, 00:18:57.588 "w_mbytes_per_sec": 0 00:18:57.588 }, 00:18:57.588 "claimed": true, 00:18:57.588 "claim_type": "exclusive_write", 00:18:57.588 "zoned": false, 00:18:57.588 "supported_io_types": { 00:18:57.588 "read": true, 00:18:57.588 "write": true, 00:18:57.588 "unmap": true, 00:18:57.588 "write_zeroes": true, 00:18:57.588 "flush": true, 00:18:57.588 "reset": true, 00:18:57.588 "compare": false, 00:18:57.588 "compare_and_write": false, 00:18:57.588 "abort": true, 00:18:57.588 "nvme_admin": false, 00:18:57.588 "nvme_io": false 00:18:57.588 }, 00:18:57.588 "memory_domains": [ 00:18:57.588 { 00:18:57.588 "dma_device_id": "system", 00:18:57.588 "dma_device_type": 1 00:18:57.588 }, 00:18:57.588 { 00:18:57.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.588 "dma_device_type": 2 00:18:57.588 } 00:18:57.588 ], 00:18:57.588 "driver_specific": { 00:18:57.588 "passthru": { 00:18:57.588 "name": "pt3", 00:18:57.588 "base_bdev_name": "malloc3" 00:18:57.588 } 00:18:57.588 } 00:18:57.588 }' 00:18:57.588 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:57.588 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:57.588 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:57.588 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:57.588 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:57.588 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:57.588 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:57.848 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:57.848 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:18:57.848 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:57.848 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:57.848 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:57.848 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:18:57.848 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:57.848 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:18:58.108 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:18:58.108 "name": "pt4", 00:18:58.108 "aliases": [ 00:18:58.108 "19265a62-0740-53db-bb2d-023cc97d71f6" 00:18:58.108 ], 00:18:58.108 "product_name": "passthru", 00:18:58.108 "block_size": 512, 00:18:58.108 "num_blocks": 65536, 00:18:58.108 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:18:58.108 "assigned_rate_limits": { 00:18:58.108 "rw_ios_per_sec": 0, 00:18:58.108 "rw_mbytes_per_sec": 0, 00:18:58.108 "r_mbytes_per_sec": 0, 00:18:58.108 "w_mbytes_per_sec": 0 00:18:58.108 }, 00:18:58.108 "claimed": true, 00:18:58.108 "claim_type": "exclusive_write", 00:18:58.108 "zoned": false, 00:18:58.108 "supported_io_types": { 00:18:58.108 "read": true, 00:18:58.108 "write": true, 00:18:58.108 "unmap": true, 00:18:58.108 "write_zeroes": true, 00:18:58.108 "flush": true, 00:18:58.108 "reset": true, 00:18:58.108 "compare": false, 00:18:58.108 "compare_and_write": false, 00:18:58.108 "abort": true, 00:18:58.108 "nvme_admin": false, 00:18:58.108 "nvme_io": false 00:18:58.108 }, 00:18:58.108 "memory_domains": [ 00:18:58.108 { 00:18:58.108 "dma_device_id": "system", 00:18:58.108 "dma_device_type": 1 00:18:58.108 }, 00:18:58.108 { 00:18:58.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.108 
"dma_device_type": 2 00:18:58.108 } 00:18:58.108 ], 00:18:58.108 "driver_specific": { 00:18:58.108 "passthru": { 00:18:58.108 "name": "pt4", 00:18:58.108 "base_bdev_name": "malloc4" 00:18:58.108 } 00:18:58.108 } 00:18:58.108 }' 00:18:58.108 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:58.108 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:18:58.108 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:18:58.108 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:58.108 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:18:58.108 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:58.108 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:58.367 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:18:58.367 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.367 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:58.367 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:18:58.367 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:18:58.367 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:58.367 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:18:58.626 [2024-05-14 11:55:25.574881] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:58.626 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=932092f3-27e1-487f-9e4d-d7d64c63a0ca 00:18:58.626 11:55:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 932092f3-27e1-487f-9e4d-d7d64c63a0ca ']' 00:18:58.626 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:58.887 [2024-05-14 11:55:25.735033] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:58.887 [2024-05-14 11:55:25.735054] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:58.887 [2024-05-14 11:55:25.735107] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:58.887 [2024-05-14 11:55:25.735203] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:58.887 [2024-05-14 11:55:25.735217] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24107a0 name raid_bdev1, state offline 00:18:58.887 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.887 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:18:59.146 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:18:59.146 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:18:59.146 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:18:59.146 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:59.146 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:18:59.146 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:59.406 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:18:59.406 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:59.666 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:18:59.666 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:59.925 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:59.925 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:00.183 11:55:27 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:00.183 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:00.442 [2024-05-14 11:55:27.407376] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:00.442 [2024-05-14 11:55:27.408775] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:00.442 [2024-05-14 11:55:27.408820] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:00.442 [2024-05-14 11:55:27.408860] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:00.442 [2024-05-14 11:55:27.408911] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:00.442 [2024-05-14 11:55:27.408949] 
bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:00.442 [2024-05-14 11:55:27.408972] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:00.442 [2024-05-14 11:55:27.408994] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:00.442 [2024-05-14 11:55:27.409012] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:00.442 [2024-05-14 11:55:27.409024] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x240f160 name raid_bdev1, state configuring 00:19:00.442 request: 00:19:00.442 { 00:19:00.442 "name": "raid_bdev1", 00:19:00.442 "raid_level": "raid1", 00:19:00.442 "base_bdevs": [ 00:19:00.442 "malloc1", 00:19:00.442 "malloc2", 00:19:00.442 "malloc3", 00:19:00.442 "malloc4" 00:19:00.442 ], 00:19:00.442 "superblock": false, 00:19:00.442 "method": "bdev_raid_create", 00:19:00.442 "req_id": 1 00:19:00.442 } 00:19:00.442 Got JSON-RPC error response 00:19:00.442 response: 00:19:00.442 { 00:19:00.442 "code": -17, 00:19:00.442 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:00.442 } 00:19:00.442 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:00.442 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:00.442 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:00.442 11:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:00.442 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.442 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 
00:19:00.700 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:19:00.700 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:19:00.700 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:00.958 [2024-05-14 11:55:27.816412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:00.958 [2024-05-14 11:55:27.816454] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:00.958 [2024-05-14 11:55:27.816473] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25bc040 00:19:00.958 [2024-05-14 11:55:27.816485] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:00.958 [2024-05-14 11:55:27.818129] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:00.958 [2024-05-14 11:55:27.818156] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:00.958 [2024-05-14 11:55:27.818226] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:19:00.958 [2024-05-14 11:55:27.818253] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:00.958 pt1 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=0 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.958 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:01.217 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:01.217 "name": "raid_bdev1", 00:19:01.217 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:01.217 "strip_size_kb": 0, 00:19:01.217 "state": "configuring", 00:19:01.217 "raid_level": "raid1", 00:19:01.217 "superblock": true, 00:19:01.217 "num_base_bdevs": 4, 00:19:01.217 "num_base_bdevs_discovered": 1, 00:19:01.217 "num_base_bdevs_operational": 4, 00:19:01.217 "base_bdevs_list": [ 00:19:01.217 { 00:19:01.217 "name": "pt1", 00:19:01.217 "uuid": "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a", 00:19:01.217 "is_configured": true, 00:19:01.217 "data_offset": 2048, 00:19:01.217 "data_size": 63488 00:19:01.217 }, 00:19:01.217 { 00:19:01.217 "name": null, 00:19:01.217 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:01.217 "is_configured": false, 00:19:01.217 "data_offset": 2048, 00:19:01.217 "data_size": 63488 00:19:01.217 }, 00:19:01.217 { 00:19:01.217 "name": null, 00:19:01.217 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:01.217 "is_configured": false, 00:19:01.217 "data_offset": 2048, 00:19:01.217 
"data_size": 63488 00:19:01.217 }, 00:19:01.217 { 00:19:01.217 "name": null, 00:19:01.217 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:01.217 "is_configured": false, 00:19:01.217 "data_offset": 2048, 00:19:01.217 "data_size": 63488 00:19:01.217 } 00:19:01.217 ] 00:19:01.217 }' 00:19:01.217 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:01.217 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:01.785 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 4 -gt 2 ']' 00:19:01.785 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:01.785 [2024-05-14 11:55:28.799177] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:01.785 [2024-05-14 11:55:28.799226] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:01.785 [2024-05-14 11:55:28.799250] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2414aa0 00:19:01.785 [2024-05-14 11:55:28.799263] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:01.785 [2024-05-14 11:55:28.799619] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:01.785 [2024-05-14 11:55:28.799638] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:01.785 [2024-05-14 11:55:28.799702] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:19:01.785 [2024-05-14 11:55:28.799721] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:01.785 pt2 00:19:01.785 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:19:02.045 [2024-05-14 11:55:28.955615] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.045 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.303 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:02.303 "name": "raid_bdev1", 00:19:02.303 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:02.303 "strip_size_kb": 0, 00:19:02.303 "state": "configuring", 00:19:02.303 "raid_level": "raid1", 00:19:02.303 "superblock": true, 00:19:02.303 "num_base_bdevs": 4, 00:19:02.303 "num_base_bdevs_discovered": 1, 00:19:02.303 
"num_base_bdevs_operational": 4, 00:19:02.303 "base_bdevs_list": [ 00:19:02.303 { 00:19:02.303 "name": "pt1", 00:19:02.303 "uuid": "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a", 00:19:02.303 "is_configured": true, 00:19:02.303 "data_offset": 2048, 00:19:02.303 "data_size": 63488 00:19:02.303 }, 00:19:02.303 { 00:19:02.303 "name": null, 00:19:02.303 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:02.303 "is_configured": false, 00:19:02.303 "data_offset": 2048, 00:19:02.303 "data_size": 63488 00:19:02.303 }, 00:19:02.303 { 00:19:02.303 "name": null, 00:19:02.303 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:02.303 "is_configured": false, 00:19:02.303 "data_offset": 2048, 00:19:02.303 "data_size": 63488 00:19:02.303 }, 00:19:02.303 { 00:19:02.303 "name": null, 00:19:02.303 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:02.303 "is_configured": false, 00:19:02.303 "data_offset": 2048, 00:19:02.303 "data_size": 63488 00:19:02.303 } 00:19:02.303 ] 00:19:02.303 }' 00:19:02.303 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:02.303 11:55:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:02.870 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:19:02.870 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:19:02.870 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:03.128 [2024-05-14 11:55:30.018428] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:03.128 [2024-05-14 11:55:30.018477] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:03.128 [2024-05-14 11:55:30.018502] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2410cf0 
00:19:03.128 [2024-05-14 11:55:30.018514] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:03.128 [2024-05-14 11:55:30.018852] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:03.128 [2024-05-14 11:55:30.018869] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:03.129 [2024-05-14 11:55:30.018933] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:19:03.129 [2024-05-14 11:55:30.018952] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:03.129 pt2 00:19:03.129 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:19:03.129 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:19:03.129 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:03.129 [2024-05-14 11:55:30.210931] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:03.129 [2024-05-14 11:55:30.210973] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:03.129 [2024-05-14 11:55:30.210992] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240a7b0 00:19:03.129 [2024-05-14 11:55:30.211004] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:03.129 [2024-05-14 11:55:30.211325] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:03.129 [2024-05-14 11:55:30.211343] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:03.129 [2024-05-14 11:55:30.211414] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:19:03.129 [2024-05-14 11:55:30.211435] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:03.387 pt3 00:19:03.387 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:19:03.387 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:19:03.387 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:03.387 [2024-05-14 11:55:30.459586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:03.387 [2024-05-14 11:55:30.459628] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:03.387 [2024-05-14 11:55:30.459650] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2412930 00:19:03.387 [2024-05-14 11:55:30.459663] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:03.387 [2024-05-14 11:55:30.459972] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:03.387 [2024-05-14 11:55:30.459988] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:03.387 [2024-05-14 11:55:30.460047] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4 00:19:03.387 [2024-05-14 11:55:30.460065] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:03.387 [2024-05-14 11:55:30.460186] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x240f860 00:19:03.387 [2024-05-14 11:55:30.460197] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:03.387 [2024-05-14 11:55:30.460372] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24149b0 00:19:03.387 [2024-05-14 11:55:30.460531] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x240f860 00:19:03.387 [2024-05-14 11:55:30.460542] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x240f860 00:19:03.387 [2024-05-14 11:55:30.460639] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:03.387 pt4 00:19:03.645 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:19:03.645 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:19:03.645 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:03.645 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:03.645 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:03.645 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:03.646 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:03.646 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:03.646 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:03.646 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:03.646 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:03.646 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:03.646 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.646 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:03.904 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 
-- # raid_bdev_info='{ 00:19:03.904 "name": "raid_bdev1", 00:19:03.904 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:03.904 "strip_size_kb": 0, 00:19:03.904 "state": "online", 00:19:03.904 "raid_level": "raid1", 00:19:03.904 "superblock": true, 00:19:03.904 "num_base_bdevs": 4, 00:19:03.904 "num_base_bdevs_discovered": 4, 00:19:03.904 "num_base_bdevs_operational": 4, 00:19:03.904 "base_bdevs_list": [ 00:19:03.904 { 00:19:03.904 "name": "pt1", 00:19:03.904 "uuid": "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a", 00:19:03.904 "is_configured": true, 00:19:03.904 "data_offset": 2048, 00:19:03.904 "data_size": 63488 00:19:03.904 }, 00:19:03.904 { 00:19:03.904 "name": "pt2", 00:19:03.904 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:03.904 "is_configured": true, 00:19:03.904 "data_offset": 2048, 00:19:03.904 "data_size": 63488 00:19:03.904 }, 00:19:03.904 { 00:19:03.904 "name": "pt3", 00:19:03.904 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:03.904 "is_configured": true, 00:19:03.904 "data_offset": 2048, 00:19:03.904 "data_size": 63488 00:19:03.904 }, 00:19:03.904 { 00:19:03.904 "name": "pt4", 00:19:03.904 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:03.904 "is_configured": true, 00:19:03.904 "data_offset": 2048, 00:19:03.904 "data_size": 63488 00:19:03.904 } 00:19:03.904 ] 00:19:03.904 }' 00:19:03.904 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:03.904 11:55:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.470 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:19:04.470 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:19:04.470 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:19:04.470 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:19:04.470 11:55:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:19:04.470 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:19:04.470 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:19:04.470 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:04.470 [2024-05-14 11:55:31.546720] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:04.729 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:19:04.729 "name": "raid_bdev1", 00:19:04.729 "aliases": [ 00:19:04.729 "932092f3-27e1-487f-9e4d-d7d64c63a0ca" 00:19:04.729 ], 00:19:04.729 "product_name": "Raid Volume", 00:19:04.729 "block_size": 512, 00:19:04.729 "num_blocks": 63488, 00:19:04.729 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:04.729 "assigned_rate_limits": { 00:19:04.729 "rw_ios_per_sec": 0, 00:19:04.729 "rw_mbytes_per_sec": 0, 00:19:04.729 "r_mbytes_per_sec": 0, 00:19:04.729 "w_mbytes_per_sec": 0 00:19:04.729 }, 00:19:04.729 "claimed": false, 00:19:04.729 "zoned": false, 00:19:04.729 "supported_io_types": { 00:19:04.729 "read": true, 00:19:04.729 "write": true, 00:19:04.729 "unmap": false, 00:19:04.729 "write_zeroes": true, 00:19:04.729 "flush": false, 00:19:04.729 "reset": true, 00:19:04.729 "compare": false, 00:19:04.729 "compare_and_write": false, 00:19:04.729 "abort": false, 00:19:04.729 "nvme_admin": false, 00:19:04.729 "nvme_io": false 00:19:04.729 }, 00:19:04.729 "memory_domains": [ 00:19:04.729 { 00:19:04.729 "dma_device_id": "system", 00:19:04.729 "dma_device_type": 1 00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:04.729 "dma_device_type": 2 00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "dma_device_id": "system", 00:19:04.729 "dma_device_type": 1 
00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:04.729 "dma_device_type": 2 00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "dma_device_id": "system", 00:19:04.729 "dma_device_type": 1 00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:04.729 "dma_device_type": 2 00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "dma_device_id": "system", 00:19:04.729 "dma_device_type": 1 00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:04.729 "dma_device_type": 2 00:19:04.729 } 00:19:04.729 ], 00:19:04.729 "driver_specific": { 00:19:04.729 "raid": { 00:19:04.729 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:04.729 "strip_size_kb": 0, 00:19:04.729 "state": "online", 00:19:04.729 "raid_level": "raid1", 00:19:04.729 "superblock": true, 00:19:04.729 "num_base_bdevs": 4, 00:19:04.729 "num_base_bdevs_discovered": 4, 00:19:04.729 "num_base_bdevs_operational": 4, 00:19:04.729 "base_bdevs_list": [ 00:19:04.729 { 00:19:04.729 "name": "pt1", 00:19:04.729 "uuid": "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a", 00:19:04.729 "is_configured": true, 00:19:04.729 "data_offset": 2048, 00:19:04.729 "data_size": 63488 00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "name": "pt2", 00:19:04.729 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:04.729 "is_configured": true, 00:19:04.729 "data_offset": 2048, 00:19:04.729 "data_size": 63488 00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "name": "pt3", 00:19:04.729 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:04.729 "is_configured": true, 00:19:04.729 "data_offset": 2048, 00:19:04.729 "data_size": 63488 00:19:04.729 }, 00:19:04.729 { 00:19:04.729 "name": "pt4", 00:19:04.729 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:04.729 "is_configured": true, 00:19:04.729 "data_offset": 2048, 00:19:04.729 "data_size": 63488 00:19:04.729 } 00:19:04.729 ] 00:19:04.729 } 00:19:04.729 } 00:19:04.729 }' 00:19:04.729 11:55:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:04.729 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:19:04.729 pt2 00:19:04.729 pt3 00:19:04.729 pt4' 00:19:04.729 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:04.729 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:04.729 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:04.988 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:04.988 "name": "pt1", 00:19:04.988 "aliases": [ 00:19:04.988 "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a" 00:19:04.988 ], 00:19:04.988 "product_name": "passthru", 00:19:04.988 "block_size": 512, 00:19:04.988 "num_blocks": 65536, 00:19:04.988 "uuid": "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a", 00:19:04.988 "assigned_rate_limits": { 00:19:04.988 "rw_ios_per_sec": 0, 00:19:04.988 "rw_mbytes_per_sec": 0, 00:19:04.988 "r_mbytes_per_sec": 0, 00:19:04.988 "w_mbytes_per_sec": 0 00:19:04.988 }, 00:19:04.988 "claimed": true, 00:19:04.988 "claim_type": "exclusive_write", 00:19:04.988 "zoned": false, 00:19:04.988 "supported_io_types": { 00:19:04.988 "read": true, 00:19:04.988 "write": true, 00:19:04.988 "unmap": true, 00:19:04.988 "write_zeroes": true, 00:19:04.988 "flush": true, 00:19:04.988 "reset": true, 00:19:04.988 "compare": false, 00:19:04.988 "compare_and_write": false, 00:19:04.988 "abort": true, 00:19:04.988 "nvme_admin": false, 00:19:04.988 "nvme_io": false 00:19:04.988 }, 00:19:04.988 "memory_domains": [ 00:19:04.988 { 00:19:04.988 "dma_device_id": "system", 00:19:04.988 "dma_device_type": 1 00:19:04.988 }, 00:19:04.988 { 00:19:04.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:19:04.988 "dma_device_type": 2 00:19:04.988 } 00:19:04.988 ], 00:19:04.988 "driver_specific": { 00:19:04.988 "passthru": { 00:19:04.988 "name": "pt1", 00:19:04.988 "base_bdev_name": "malloc1" 00:19:04.988 } 00:19:04.988 } 00:19:04.988 }' 00:19:04.988 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:04.988 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:04.988 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:04.988 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:04.988 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:04.988 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:04.988 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:05.264 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:05.264 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:05.264 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:05.264 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:05.264 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:05.264 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:05.264 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:05.264 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:05.571 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:05.571 "name": "pt2", 00:19:05.571 "aliases": [ 00:19:05.571 
"cf736448-5377-50de-ac96-a57a5d657c4a" 00:19:05.571 ], 00:19:05.571 "product_name": "passthru", 00:19:05.571 "block_size": 512, 00:19:05.571 "num_blocks": 65536, 00:19:05.571 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:05.571 "assigned_rate_limits": { 00:19:05.571 "rw_ios_per_sec": 0, 00:19:05.571 "rw_mbytes_per_sec": 0, 00:19:05.571 "r_mbytes_per_sec": 0, 00:19:05.571 "w_mbytes_per_sec": 0 00:19:05.571 }, 00:19:05.571 "claimed": true, 00:19:05.571 "claim_type": "exclusive_write", 00:19:05.571 "zoned": false, 00:19:05.571 "supported_io_types": { 00:19:05.571 "read": true, 00:19:05.571 "write": true, 00:19:05.571 "unmap": true, 00:19:05.571 "write_zeroes": true, 00:19:05.571 "flush": true, 00:19:05.571 "reset": true, 00:19:05.571 "compare": false, 00:19:05.571 "compare_and_write": false, 00:19:05.571 "abort": true, 00:19:05.571 "nvme_admin": false, 00:19:05.571 "nvme_io": false 00:19:05.571 }, 00:19:05.571 "memory_domains": [ 00:19:05.571 { 00:19:05.571 "dma_device_id": "system", 00:19:05.571 "dma_device_type": 1 00:19:05.571 }, 00:19:05.571 { 00:19:05.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.571 "dma_device_type": 2 00:19:05.571 } 00:19:05.571 ], 00:19:05.572 "driver_specific": { 00:19:05.572 "passthru": { 00:19:05.572 "name": "pt2", 00:19:05.572 "base_bdev_name": "malloc2" 00:19:05.572 } 00:19:05.572 } 00:19:05.572 }' 00:19:05.572 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:05.572 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:05.572 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:05.572 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:05.572 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:05.572 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:05.572 11:55:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:05.830 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:05.830 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:05.830 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:05.830 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:05.830 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:05.830 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:05.830 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:05.830 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:06.089 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:06.089 "name": "pt3", 00:19:06.089 "aliases": [ 00:19:06.089 "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c" 00:19:06.089 ], 00:19:06.089 "product_name": "passthru", 00:19:06.089 "block_size": 512, 00:19:06.089 "num_blocks": 65536, 00:19:06.089 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:06.089 "assigned_rate_limits": { 00:19:06.089 "rw_ios_per_sec": 0, 00:19:06.089 "rw_mbytes_per_sec": 0, 00:19:06.089 "r_mbytes_per_sec": 0, 00:19:06.089 "w_mbytes_per_sec": 0 00:19:06.089 }, 00:19:06.089 "claimed": true, 00:19:06.089 "claim_type": "exclusive_write", 00:19:06.089 "zoned": false, 00:19:06.089 "supported_io_types": { 00:19:06.089 "read": true, 00:19:06.089 "write": true, 00:19:06.089 "unmap": true, 00:19:06.089 "write_zeroes": true, 00:19:06.089 "flush": true, 00:19:06.089 "reset": true, 00:19:06.089 "compare": false, 00:19:06.089 "compare_and_write": false, 00:19:06.089 "abort": true, 00:19:06.089 
"nvme_admin": false, 00:19:06.089 "nvme_io": false 00:19:06.089 }, 00:19:06.089 "memory_domains": [ 00:19:06.089 { 00:19:06.089 "dma_device_id": "system", 00:19:06.089 "dma_device_type": 1 00:19:06.089 }, 00:19:06.089 { 00:19:06.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.089 "dma_device_type": 2 00:19:06.089 } 00:19:06.089 ], 00:19:06.089 "driver_specific": { 00:19:06.089 "passthru": { 00:19:06.089 "name": "pt3", 00:19:06.089 "base_bdev_name": "malloc3" 00:19:06.089 } 00:19:06.089 } 00:19:06.089 }' 00:19:06.089 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:06.089 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:06.089 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:06.089 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:06.347 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:06.347 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:06.347 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:06.347 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:06.347 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:06.347 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:06.347 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:06.347 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:06.348 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:19:06.348 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:19:06.348 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:06.607 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:19:06.607 "name": "pt4", 00:19:06.607 "aliases": [ 00:19:06.607 "19265a62-0740-53db-bb2d-023cc97d71f6" 00:19:06.607 ], 00:19:06.607 "product_name": "passthru", 00:19:06.607 "block_size": 512, 00:19:06.607 "num_blocks": 65536, 00:19:06.607 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:06.607 "assigned_rate_limits": { 00:19:06.607 "rw_ios_per_sec": 0, 00:19:06.607 "rw_mbytes_per_sec": 0, 00:19:06.607 "r_mbytes_per_sec": 0, 00:19:06.607 "w_mbytes_per_sec": 0 00:19:06.607 }, 00:19:06.607 "claimed": true, 00:19:06.607 "claim_type": "exclusive_write", 00:19:06.607 "zoned": false, 00:19:06.607 "supported_io_types": { 00:19:06.607 "read": true, 00:19:06.607 "write": true, 00:19:06.607 "unmap": true, 00:19:06.607 "write_zeroes": true, 00:19:06.607 "flush": true, 00:19:06.607 "reset": true, 00:19:06.607 "compare": false, 00:19:06.607 "compare_and_write": false, 00:19:06.607 "abort": true, 00:19:06.607 "nvme_admin": false, 00:19:06.607 "nvme_io": false 00:19:06.607 }, 00:19:06.607 "memory_domains": [ 00:19:06.607 { 00:19:06.607 "dma_device_id": "system", 00:19:06.607 "dma_device_type": 1 00:19:06.607 }, 00:19:06.607 { 00:19:06.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.607 "dma_device_type": 2 00:19:06.607 } 00:19:06.607 ], 00:19:06.607 "driver_specific": { 00:19:06.607 "passthru": { 00:19:06.607 "name": "pt4", 00:19:06.607 "base_bdev_name": "malloc4" 00:19:06.607 } 00:19:06.607 } 00:19:06.607 }' 00:19:06.607 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:06.866 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:19:06.866 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:19:06.866 11:55:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:06.866 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:19:06.866 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:06.866 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:06.866 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:19:06.866 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:06.866 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:07.125 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:19:07.125 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:19:07.125 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:19:07.125 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:07.125 [2024-05-14 11:55:34.173727] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:07.125 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 932092f3-27e1-487f-9e4d-d7d64c63a0ca '!=' 932092f3-27e1-487f-9e4d-d7d64c63a0ca ']' 00:19:07.125 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:19:07.125 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:19:07.125 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 0 00:19:07.125 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:07.384 [2024-05-14 11:55:34.414100] 
bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.384 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:07.643 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:07.643 "name": "raid_bdev1", 00:19:07.643 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:07.643 "strip_size_kb": 0, 00:19:07.643 "state": "online", 00:19:07.643 "raid_level": "raid1", 00:19:07.643 "superblock": true, 00:19:07.643 "num_base_bdevs": 4, 00:19:07.643 "num_base_bdevs_discovered": 3, 00:19:07.643 "num_base_bdevs_operational": 3, 00:19:07.643 "base_bdevs_list": [ 00:19:07.643 { 
00:19:07.643 "name": null, 00:19:07.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.643 "is_configured": false, 00:19:07.643 "data_offset": 2048, 00:19:07.643 "data_size": 63488 00:19:07.643 }, 00:19:07.643 { 00:19:07.643 "name": "pt2", 00:19:07.643 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:07.643 "is_configured": true, 00:19:07.643 "data_offset": 2048, 00:19:07.643 "data_size": 63488 00:19:07.643 }, 00:19:07.643 { 00:19:07.643 "name": "pt3", 00:19:07.643 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:07.643 "is_configured": true, 00:19:07.643 "data_offset": 2048, 00:19:07.643 "data_size": 63488 00:19:07.643 }, 00:19:07.643 { 00:19:07.643 "name": "pt4", 00:19:07.643 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:07.643 "is_configured": true, 00:19:07.643 "data_offset": 2048, 00:19:07.643 "data_size": 63488 00:19:07.643 } 00:19:07.643 ] 00:19:07.643 }' 00:19:07.643 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:07.643 11:55:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.211 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:08.470 [2024-05-14 11:55:35.404706] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:08.470 [2024-05-14 11:55:35.404738] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:08.470 [2024-05-14 11:55:35.404795] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:08.470 [2024-05-14 11:55:35.404873] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:08.470 [2024-05-14 11:55:35.404886] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x240f860 name raid_bdev1, state offline 00:19:08.470 11:55:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.470 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:19:08.729 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:19:08.729 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:19:08.729 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:19:08.729 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:19:08.729 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:08.988 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:19:08.988 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:19:08.988 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:09.251 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:19:09.251 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:19:09.251 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:09.510 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:19:09.510 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:19:09.510 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:19:09.510 11:55:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:19:09.510 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:09.769 [2024-05-14 11:55:36.659947] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:09.769 [2024-05-14 11:55:36.659994] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:09.769 [2024-05-14 11:55:36.660015] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2410a40 00:19:09.769 [2024-05-14 11:55:36.660027] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:09.769 [2024-05-14 11:55:36.661923] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:09.769 [2024-05-14 11:55:36.661955] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:09.769 [2024-05-14 11:55:36.662024] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:19:09.769 [2024-05-14 11:55:36.662051] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:09.769 pt2 00:19:09.769 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:09.769 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:09.769 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:09.769 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:09.769 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:09.770 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=3 00:19:09.770 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:09.770 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:09.770 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:09.770 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:09.770 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.770 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:10.028 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:10.028 "name": "raid_bdev1", 00:19:10.028 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:10.028 "strip_size_kb": 0, 00:19:10.028 "state": "configuring", 00:19:10.028 "raid_level": "raid1", 00:19:10.028 "superblock": true, 00:19:10.028 "num_base_bdevs": 4, 00:19:10.028 "num_base_bdevs_discovered": 1, 00:19:10.028 "num_base_bdevs_operational": 3, 00:19:10.028 "base_bdevs_list": [ 00:19:10.028 { 00:19:10.028 "name": null, 00:19:10.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.028 "is_configured": false, 00:19:10.028 "data_offset": 2048, 00:19:10.028 "data_size": 63488 00:19:10.028 }, 00:19:10.028 { 00:19:10.028 "name": "pt2", 00:19:10.028 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:10.028 "is_configured": true, 00:19:10.028 "data_offset": 2048, 00:19:10.028 "data_size": 63488 00:19:10.028 }, 00:19:10.028 { 00:19:10.028 "name": null, 00:19:10.028 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:10.028 "is_configured": false, 00:19:10.028 "data_offset": 2048, 00:19:10.028 "data_size": 63488 00:19:10.028 }, 00:19:10.028 { 00:19:10.028 "name": null, 00:19:10.028 "uuid": 
"19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:10.028 "is_configured": false, 00:19:10.028 "data_offset": 2048, 00:19:10.028 "data_size": 63488 00:19:10.028 } 00:19:10.028 ] 00:19:10.028 }' 00:19:10.028 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:10.028 11:55:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.597 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i++ )) 00:19:10.597 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:19:10.597 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:10.856 [2024-05-14 11:55:37.746838] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:10.856 [2024-05-14 11:55:37.746890] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:10.856 [2024-05-14 11:55:37.746911] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2409bd0 00:19:10.856 [2024-05-14 11:55:37.746924] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:10.856 [2024-05-14 11:55:37.747302] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:10.856 [2024-05-14 11:55:37.747322] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:10.856 [2024-05-14 11:55:37.747387] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:19:10.856 [2024-05-14 11:55:37.747416] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:10.856 pt3 00:19:10.856 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:10.856 11:55:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:10.856 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:10.857 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:10.857 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:10.857 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:10.857 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:10.857 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:10.857 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:10.857 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:10.857 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.857 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.116 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:11.116 "name": "raid_bdev1", 00:19:11.116 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:11.116 "strip_size_kb": 0, 00:19:11.116 "state": "configuring", 00:19:11.116 "raid_level": "raid1", 00:19:11.116 "superblock": true, 00:19:11.116 "num_base_bdevs": 4, 00:19:11.116 "num_base_bdevs_discovered": 2, 00:19:11.116 "num_base_bdevs_operational": 3, 00:19:11.116 "base_bdevs_list": [ 00:19:11.116 { 00:19:11.116 "name": null, 00:19:11.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.116 "is_configured": false, 00:19:11.116 "data_offset": 2048, 00:19:11.116 "data_size": 63488 00:19:11.116 }, 
00:19:11.116 { 00:19:11.116 "name": "pt2", 00:19:11.116 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:11.116 "is_configured": true, 00:19:11.116 "data_offset": 2048, 00:19:11.116 "data_size": 63488 00:19:11.116 }, 00:19:11.116 { 00:19:11.116 "name": "pt3", 00:19:11.116 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:11.116 "is_configured": true, 00:19:11.116 "data_offset": 2048, 00:19:11.116 "data_size": 63488 00:19:11.116 }, 00:19:11.116 { 00:19:11.116 "name": null, 00:19:11.116 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:11.116 "is_configured": false, 00:19:11.116 "data_offset": 2048, 00:19:11.116 "data_size": 63488 00:19:11.116 } 00:19:11.116 ] 00:19:11.116 }' 00:19:11.116 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:11.116 11:55:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.685 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i++ )) 00:19:11.685 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:19:11.685 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # i=3 00:19:11.685 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:11.945 [2024-05-14 11:55:38.845747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:11.945 [2024-05-14 11:55:38.845795] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:11.945 [2024-05-14 11:55:38.845822] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2410430 00:19:11.945 [2024-05-14 11:55:38.845834] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:11.945 [2024-05-14 11:55:38.846204] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:11.945 [2024-05-14 11:55:38.846224] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:11.945 [2024-05-14 11:55:38.846285] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4 00:19:11.945 [2024-05-14 11:55:38.846305] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:11.945 [2024-05-14 11:55:38.846430] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x240b530 00:19:11.945 [2024-05-14 11:55:38.846441] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:11.945 [2024-05-14 11:55:38.846614] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2410770 00:19:11.945 [2024-05-14 11:55:38.846752] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x240b530 00:19:11.945 [2024-05-14 11:55:38.846762] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x240b530 00:19:11.945 [2024-05-14 11:55:38.846858] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:11.945 pt4 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 
00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.945 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:12.205 11:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:12.205 "name": "raid_bdev1", 00:19:12.205 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:12.205 "strip_size_kb": 0, 00:19:12.205 "state": "online", 00:19:12.205 "raid_level": "raid1", 00:19:12.205 "superblock": true, 00:19:12.205 "num_base_bdevs": 4, 00:19:12.205 "num_base_bdevs_discovered": 3, 00:19:12.205 "num_base_bdevs_operational": 3, 00:19:12.205 "base_bdevs_list": [ 00:19:12.205 { 00:19:12.205 "name": null, 00:19:12.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:12.205 "is_configured": false, 00:19:12.205 "data_offset": 2048, 00:19:12.205 "data_size": 63488 00:19:12.205 }, 00:19:12.205 { 00:19:12.205 "name": "pt2", 00:19:12.205 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:12.205 "is_configured": true, 00:19:12.205 "data_offset": 2048, 00:19:12.205 "data_size": 63488 00:19:12.205 }, 00:19:12.205 { 00:19:12.205 "name": "pt3", 00:19:12.205 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:12.205 "is_configured": true, 00:19:12.205 "data_offset": 2048, 00:19:12.205 "data_size": 63488 00:19:12.205 }, 00:19:12.205 { 00:19:12.205 "name": "pt4", 00:19:12.205 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:12.205 "is_configured": true, 00:19:12.205 "data_offset": 2048, 00:19:12.205 "data_size": 63488 
00:19:12.205 } 00:19:12.205 ] 00:19:12.205 }' 00:19:12.205 11:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:12.205 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.772 11:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # '[' 4 -gt 2 ']' 00:19:12.772 11:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:13.031 [2024-05-14 11:55:39.868447] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:13.031 [2024-05-14 11:55:39.868470] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:13.031 [2024-05-14 11:55:39.868524] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:13.031 [2024-05-14 11:55:39.868591] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:13.031 [2024-05-14 11:55:39.868602] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x240b530 name raid_bdev1, state offline 00:19:13.031 11:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.031 11:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # jq -r '.[]' 00:19:13.291 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # raid_bdev= 00:19:13.291 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@529 -- # '[' -n '' ']' 00:19:13.291 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:13.551 [2024-05-14 11:55:40.377783] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:13.551 [2024-05-14 11:55:40.377833] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:13.551 [2024-05-14 11:55:40.377854] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240df50 00:19:13.551 [2024-05-14 11:55:40.377867] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:13.551 [2024-05-14 11:55:40.379739] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:13.551 [2024-05-14 11:55:40.379768] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:13.551 [2024-05-14 11:55:40.379833] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:19:13.551 [2024-05-14 11:55:40.379860] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:13.551 pt1 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:13.551 11:55:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.551 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:13.810 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:13.810 "name": "raid_bdev1", 00:19:13.810 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:13.810 "strip_size_kb": 0, 00:19:13.810 "state": "configuring", 00:19:13.810 "raid_level": "raid1", 00:19:13.810 "superblock": true, 00:19:13.810 "num_base_bdevs": 4, 00:19:13.810 "num_base_bdevs_discovered": 1, 00:19:13.810 "num_base_bdevs_operational": 4, 00:19:13.810 "base_bdevs_list": [ 00:19:13.810 { 00:19:13.810 "name": "pt1", 00:19:13.810 "uuid": "870588a0-96f4-5e25-bea3-7bdb9ae6fa6a", 00:19:13.810 "is_configured": true, 00:19:13.810 "data_offset": 2048, 00:19:13.810 "data_size": 63488 00:19:13.810 }, 00:19:13.810 { 00:19:13.810 "name": null, 00:19:13.810 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:13.810 "is_configured": false, 00:19:13.810 "data_offset": 2048, 00:19:13.810 "data_size": 63488 00:19:13.810 }, 00:19:13.810 { 00:19:13.810 "name": null, 00:19:13.810 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:13.810 "is_configured": false, 00:19:13.810 "data_offset": 2048, 00:19:13.810 "data_size": 63488 00:19:13.810 }, 00:19:13.810 { 00:19:13.810 "name": null, 00:19:13.810 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:13.810 "is_configured": false, 00:19:13.810 "data_offset": 2048, 00:19:13.810 "data_size": 63488 00:19:13.810 } 00:19:13.810 ] 00:19:13.810 }' 00:19:13.810 11:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:13.810 11:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:19:14.379 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i = 1 )) 00:19:14.379 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:19:14.379 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:14.379 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:19:14.379 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:19:14.379 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:14.639 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:19:14.639 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:19:14.639 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:14.898 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i++ )) 00:19:14.898 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # (( i < num_base_bdevs )) 00:19:14.898 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # i=3 00:19:14.898 11:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:15.158 [2024-05-14 11:55:41.986069] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:15.158 [2024-05-14 11:55:41.986115] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:15.158 [2024-05-14 
11:55:41.986134] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2410a40 00:19:15.158 [2024-05-14 11:55:41.986147] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:15.158 [2024-05-14 11:55:41.986511] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:15.158 [2024-05-14 11:55:41.986529] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:15.158 [2024-05-14 11:55:41.986590] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt4 00:19:15.158 [2024-05-14 11:55:41.986601] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt4 (4) greater than existing raid bdev raid_bdev1 (2) 00:19:15.158 [2024-05-14 11:55:41.986611] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:15.158 [2024-05-14 11:55:41.986625] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24a5c70 name raid_bdev1, state configuring 00:19:15.158 [2024-05-14 11:55:41.986658] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:15.158 pt4 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@551 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:15.158 11:55:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.158 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.417 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:15.417 "name": "raid_bdev1", 00:19:15.417 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:15.417 "strip_size_kb": 0, 00:19:15.417 "state": "configuring", 00:19:15.417 "raid_level": "raid1", 00:19:15.417 "superblock": true, 00:19:15.417 "num_base_bdevs": 4, 00:19:15.417 "num_base_bdevs_discovered": 1, 00:19:15.417 "num_base_bdevs_operational": 3, 00:19:15.417 "base_bdevs_list": [ 00:19:15.417 { 00:19:15.417 "name": null, 00:19:15.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.417 "is_configured": false, 00:19:15.417 "data_offset": 2048, 00:19:15.417 "data_size": 63488 00:19:15.417 }, 00:19:15.417 { 00:19:15.417 "name": null, 00:19:15.417 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:15.417 "is_configured": false, 00:19:15.417 "data_offset": 2048, 00:19:15.417 "data_size": 63488 00:19:15.417 }, 00:19:15.417 { 00:19:15.417 "name": null, 00:19:15.417 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:15.417 "is_configured": false, 00:19:15.417 "data_offset": 2048, 00:19:15.417 "data_size": 63488 00:19:15.417 }, 00:19:15.417 { 00:19:15.417 "name": "pt4", 00:19:15.417 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:15.417 "is_configured": true, 00:19:15.417 "data_offset": 2048, 00:19:15.417 "data_size": 63488 00:19:15.417 } 
00:19:15.417 ] 00:19:15.417 }' 00:19:15.417 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:15.417 11:55:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:15.984 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i = 1 )) 00:19:15.984 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:19:15.984 11:55:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:15.984 [2024-05-14 11:55:43.064942] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:15.984 [2024-05-14 11:55:43.064993] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:15.984 [2024-05-14 11:55:43.065013] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2414670 00:19:15.984 [2024-05-14 11:55:43.065025] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:15.984 [2024-05-14 11:55:43.065386] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:15.984 [2024-05-14 11:55:43.065425] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:15.984 [2024-05-14 11:55:43.065487] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:19:15.984 [2024-05-14 11:55:43.065506] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:16.244 pt2 00:19:16.244 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i++ )) 00:19:16.244 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:19:16.244 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:16.244 [2024-05-14 11:55:43.317612] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:16.244 [2024-05-14 11:55:43.317640] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:16.244 [2024-05-14 11:55:43.317657] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2410310 00:19:16.244 [2024-05-14 11:55:43.317669] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:16.244 [2024-05-14 11:55:43.317946] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:16.244 [2024-05-14 11:55:43.317970] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:16.244 [2024-05-14 11:55:43.318017] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt3 00:19:16.244 [2024-05-14 11:55:43.318033] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:16.244 [2024-05-14 11:55:43.318143] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x2409bd0 00:19:16.244 [2024-05-14 11:55:43.318154] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:16.244 [2024-05-14 11:55:43.318318] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2411cb0 00:19:16.244 [2024-05-14 11:55:43.318464] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2409bd0 00:19:16.244 [2024-05-14 11:55:43.318475] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2409bd0 00:19:16.244 [2024-05-14 11:55:43.318573] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:16.244 pt3 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( 
i++ )) 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # (( i < num_base_bdevs - 1 )) 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@559 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.504 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:16.765 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:16.765 "name": "raid_bdev1", 00:19:16.765 "uuid": "932092f3-27e1-487f-9e4d-d7d64c63a0ca", 00:19:16.765 "strip_size_kb": 0, 00:19:16.765 "state": "online", 00:19:16.765 "raid_level": "raid1", 00:19:16.765 "superblock": true, 00:19:16.765 "num_base_bdevs": 4, 00:19:16.765 "num_base_bdevs_discovered": 3, 00:19:16.765 "num_base_bdevs_operational": 
3, 00:19:16.765 "base_bdevs_list": [ 00:19:16.765 { 00:19:16.765 "name": null, 00:19:16.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.765 "is_configured": false, 00:19:16.765 "data_offset": 2048, 00:19:16.765 "data_size": 63488 00:19:16.765 }, 00:19:16.765 { 00:19:16.765 "name": "pt2", 00:19:16.765 "uuid": "cf736448-5377-50de-ac96-a57a5d657c4a", 00:19:16.765 "is_configured": true, 00:19:16.765 "data_offset": 2048, 00:19:16.765 "data_size": 63488 00:19:16.765 }, 00:19:16.765 { 00:19:16.765 "name": "pt3", 00:19:16.765 "uuid": "bf6f1a51-c89d-5ade-b625-2bc9c9046d9c", 00:19:16.765 "is_configured": true, 00:19:16.765 "data_offset": 2048, 00:19:16.765 "data_size": 63488 00:19:16.765 }, 00:19:16.765 { 00:19:16.765 "name": "pt4", 00:19:16.765 "uuid": "19265a62-0740-53db-bb2d-023cc97d71f6", 00:19:16.765 "is_configured": true, 00:19:16.765 "data_offset": 2048, 00:19:16.765 "data_size": 63488 00:19:16.765 } 00:19:16.765 ] 00:19:16.765 }' 00:19:16.765 11:55:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:16.765 11:55:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.333 11:55:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:17.333 11:55:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:19:17.333 [2024-05-14 11:55:44.408759] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # '[' 932092f3-27e1-487f-9e4d-d7d64c63a0ca '!=' 932092f3-27e1-487f-9e4d-d7d64c63a0ca ']' 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@568 -- # killprocess 1746308 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 1746308 ']' 00:19:17.593 11:55:44 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 1746308 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1746308 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1746308' 00:19:17.593 killing process with pid 1746308 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 1746308 00:19:17.593 [2024-05-14 11:55:44.474266] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:17.593 [2024-05-14 11:55:44.474325] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:17.593 [2024-05-14 11:55:44.474394] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:17.593 [2024-05-14 11:55:44.474412] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2409bd0 name raid_bdev1, state offline 00:19:17.593 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 1746308 00:19:17.593 [2024-05-14 11:55:44.542658] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:17.853 11:55:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # return 0 00:19:17.853 00:19:17.853 real 0m26.095s 00:19:17.853 user 0m47.698s 00:19:17.853 sys 0m4.572s 00:19:17.853 11:55:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:19:17.853 11:55:44 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.853 ************************************ 00:19:17.853 END TEST raid_superblock_test 00:19:17.853 ************************************ 00:19:17.853 11:55:44 bdev_raid -- bdev/bdev_raid.sh@821 -- # '[' true = true ']' 00:19:17.853 11:55:44 bdev_raid -- bdev/bdev_raid.sh@822 -- # for n in 2 4 00:19:17.853 11:55:44 bdev_raid -- bdev/bdev_raid.sh@823 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:19:17.853 11:55:44 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:19:17.853 11:55:44 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:19:17.853 11:55:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:18.112 ************************************ 00:19:18.112 START TEST raid_rebuild_test 00:19:18.112 ************************************ 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 false false true 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local superblock=false 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local verify=true 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # local strip_size 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@582 -- # local create_arg 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local data_offset 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # '[' false = true ']' 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # raid_pid=1750303 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # waitforlisten 1750303 /var/tmp/spdk-raid.sock 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@827 -- # '[' -z 1750303 ']' 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:18.112 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:18.112 11:55:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:18.112 [2024-05-14 11:55:45.033709] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:19:18.112 [2024-05-14 11:55:45.033770] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1750303 ] 00:19:18.113 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:18.113 Zero copy mechanism will not be used. 
00:19:18.113 [2024-05-14 11:55:45.162352] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:18.372 [2024-05-14 11:55:45.269701] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:19:18.372 [2024-05-14 11:55:45.340750] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:18.372 [2024-05-14 11:55:45.340787] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:18.941 11:55:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:19:18.941 11:55:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # return 0 00:19:18.941 11:55:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:19:18.941 11:55:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:19.201 BaseBdev1_malloc 00:19:19.201 11:55:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:19.492 [2024-05-14 11:55:46.434677] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:19.492 [2024-05-14 11:55:46.434726] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:19.492 [2024-05-14 11:55:46.434749] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x133c960 00:19:19.492 [2024-05-14 11:55:46.434762] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:19.492 [2024-05-14 11:55:46.436508] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:19.492 [2024-05-14 11:55:46.436536] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:19.492 BaseBdev1 00:19:19.492 11:55:46 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:19:19.492 11:55:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:19.752 BaseBdev2_malloc 00:19:19.752 11:55:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:20.010 [2024-05-14 11:55:46.928840] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:20.010 [2024-05-14 11:55:46.928888] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:20.011 [2024-05-14 11:55:46.928912] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14efb40 00:19:20.011 [2024-05-14 11:55:46.928925] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:20.011 [2024-05-14 11:55:46.930521] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:20.011 [2024-05-14 11:55:46.930550] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:20.011 BaseBdev2 00:19:20.011 11:55:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:20.270 spare_malloc 00:19:20.270 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:20.529 spare_delay 00:19:20.529 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:19:20.788 [2024-05-14 11:55:47.660561] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:20.788 [2024-05-14 11:55:47.660605] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:20.788 [2024-05-14 11:55:47.660625] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1338700 00:19:20.788 [2024-05-14 11:55:47.660638] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:20.788 [2024-05-14 11:55:47.662239] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:20.788 [2024-05-14 11:55:47.662268] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:20.788 spare 00:19:20.788 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:21.048 [2024-05-14 11:55:47.901227] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:21.048 [2024-05-14 11:55:47.902572] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:21.048 [2024-05-14 11:55:47.902649] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1337a50 00:19:21.048 [2024-05-14 11:55:47.902661] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:21.048 [2024-05-14 11:55:47.902866] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13352d0 00:19:21.048 [2024-05-14 11:55:47.903011] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1337a50 00:19:21.048 [2024-05-14 11:55:47.903021] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1337a50 00:19:21.048 [2024-05-14 11:55:47.903134] bdev_raid.c: 315:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.048 11:55:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:21.306 11:55:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:21.306 "name": "raid_bdev1", 00:19:21.306 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:21.306 "strip_size_kb": 0, 00:19:21.306 "state": "online", 00:19:21.306 "raid_level": "raid1", 00:19:21.306 "superblock": false, 00:19:21.306 "num_base_bdevs": 2, 00:19:21.306 "num_base_bdevs_discovered": 2, 00:19:21.306 "num_base_bdevs_operational": 2, 00:19:21.306 "base_bdevs_list": [ 00:19:21.306 { 00:19:21.306 "name": "BaseBdev1", 00:19:21.306 "uuid": 
"c4438fb5-eda5-5221-a615-e49df46a7af5", 00:19:21.306 "is_configured": true, 00:19:21.306 "data_offset": 0, 00:19:21.306 "data_size": 65536 00:19:21.306 }, 00:19:21.306 { 00:19:21.306 "name": "BaseBdev2", 00:19:21.306 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:21.306 "is_configured": true, 00:19:21.306 "data_offset": 0, 00:19:21.306 "data_size": 65536 00:19:21.306 } 00:19:21.306 ] 00:19:21.306 }' 00:19:21.306 11:55:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:21.306 11:55:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:21.873 11:55:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:21.873 11:55:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:19:22.132 [2024-05-14 11:55:48.976473] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:22.132 11:55:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=65536 00:19:22.132 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.132 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # data_offset=0 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:22.391 11:55:49 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:22.391 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:22.392 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:22.392 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:22.392 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:22.392 [2024-05-14 11:55:49.469614] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1337960 00:19:22.651 /dev/nbd0 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:22.651 1+0 records in 00:19:22.651 1+0 records out 00:19:22.651 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259663 s, 15.8 MB/s 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:19:22.651 11:55:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:19:27.924 65536+0 records in 00:19:27.924 65536+0 records out 00:19:27.924 33554432 bytes (34 MB, 32 MiB) copied, 4.72852 s, 7.1 MB/s 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:27.924 [2024-05-14 11:55:54.532015] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:27.924 [2024-05-14 11:55:54.756362] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:27.924 11:55:54 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.924 11:55:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:28.183 11:55:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:28.183 "name": "raid_bdev1", 00:19:28.183 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:28.183 "strip_size_kb": 0, 00:19:28.183 "state": "online", 00:19:28.183 "raid_level": "raid1", 00:19:28.183 "superblock": false, 00:19:28.183 "num_base_bdevs": 2, 00:19:28.183 "num_base_bdevs_discovered": 1, 00:19:28.183 "num_base_bdevs_operational": 1, 00:19:28.183 "base_bdevs_list": [ 00:19:28.183 { 00:19:28.183 "name": null, 00:19:28.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:28.183 "is_configured": false, 00:19:28.183 "data_offset": 0, 00:19:28.183 "data_size": 65536 00:19:28.183 }, 00:19:28.183 { 00:19:28.183 "name": "BaseBdev2", 
00:19:28.183 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:28.183 "is_configured": true, 00:19:28.183 "data_offset": 0, 00:19:28.183 "data_size": 65536 00:19:28.183 } 00:19:28.183 ] 00:19:28.183 }' 00:19:28.183 11:55:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:28.183 11:55:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.752 11:55:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:29.012 [2024-05-14 11:55:55.843258] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:29.012 [2024-05-14 11:55:55.848256] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13352d0 00:19:29.012 [2024-05-14 11:55:55.850490] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:29.012 11:55:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # sleep 1 00:19:29.948 11:55:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:29.948 11:55:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:29.948 11:55:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:29.948 11:55:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:29.948 11:55:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:29.948 11:55:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.948 11:55:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:30.207 11:55:57 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:30.207 "name": "raid_bdev1", 00:19:30.207 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:30.207 "strip_size_kb": 0, 00:19:30.207 "state": "online", 00:19:30.207 "raid_level": "raid1", 00:19:30.207 "superblock": false, 00:19:30.207 "num_base_bdevs": 2, 00:19:30.207 "num_base_bdevs_discovered": 2, 00:19:30.207 "num_base_bdevs_operational": 2, 00:19:30.207 "process": { 00:19:30.207 "type": "rebuild", 00:19:30.207 "target": "spare", 00:19:30.207 "progress": { 00:19:30.207 "blocks": 24576, 00:19:30.207 "percent": 37 00:19:30.207 } 00:19:30.207 }, 00:19:30.207 "base_bdevs_list": [ 00:19:30.207 { 00:19:30.207 "name": "spare", 00:19:30.207 "uuid": "6d29804a-1db4-51f3-9a23-84c9406c99ad", 00:19:30.207 "is_configured": true, 00:19:30.207 "data_offset": 0, 00:19:30.207 "data_size": 65536 00:19:30.207 }, 00:19:30.207 { 00:19:30.207 "name": "BaseBdev2", 00:19:30.207 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:30.207 "is_configured": true, 00:19:30.207 "data_offset": 0, 00:19:30.207 "data_size": 65536 00:19:30.207 } 00:19:30.207 ] 00:19:30.207 }' 00:19:30.207 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:30.207 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:30.207 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:30.207 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:30.207 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:30.465 [2024-05-14 11:55:57.437714] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:30.465 [2024-05-14 11:55:57.463191] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:19:30.465 [2024-05-14 11:55:57.463236] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:30.465 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:30.465 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:30.465 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:30.466 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:30.466 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:30.466 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:30.466 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:30.466 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:30.466 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:30.466 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:30.466 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.466 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:30.723 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:30.723 "name": "raid_bdev1", 00:19:30.723 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:30.723 "strip_size_kb": 0, 00:19:30.723 "state": "online", 00:19:30.723 "raid_level": "raid1", 00:19:30.723 "superblock": false, 00:19:30.723 "num_base_bdevs": 2, 00:19:30.723 "num_base_bdevs_discovered": 1, 00:19:30.723 "num_base_bdevs_operational": 1, 00:19:30.723 
"base_bdevs_list": [ 00:19:30.723 { 00:19:30.723 "name": null, 00:19:30.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:30.723 "is_configured": false, 00:19:30.723 "data_offset": 0, 00:19:30.723 "data_size": 65536 00:19:30.723 }, 00:19:30.723 { 00:19:30.723 "name": "BaseBdev2", 00:19:30.723 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:30.723 "is_configured": true, 00:19:30.723 "data_offset": 0, 00:19:30.723 "data_size": 65536 00:19:30.723 } 00:19:30.723 ] 00:19:30.723 }' 00:19:30.723 11:55:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:30.723 11:55:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:31.289 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:31.289 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:31.289 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:31.289 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:31.289 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:31.289 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.289 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:31.547 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:31.547 "name": "raid_bdev1", 00:19:31.547 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:31.547 "strip_size_kb": 0, 00:19:31.547 "state": "online", 00:19:31.547 "raid_level": "raid1", 00:19:31.547 "superblock": false, 00:19:31.547 "num_base_bdevs": 2, 00:19:31.547 "num_base_bdevs_discovered": 1, 00:19:31.547 "num_base_bdevs_operational": 1, 
00:19:31.547 "base_bdevs_list": [ 00:19:31.547 { 00:19:31.547 "name": null, 00:19:31.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.547 "is_configured": false, 00:19:31.547 "data_offset": 0, 00:19:31.547 "data_size": 65536 00:19:31.547 }, 00:19:31.547 { 00:19:31.547 "name": "BaseBdev2", 00:19:31.547 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:31.547 "is_configured": true, 00:19:31.547 "data_offset": 0, 00:19:31.547 "data_size": 65536 00:19:31.547 } 00:19:31.547 ] 00:19:31.547 }' 00:19:31.547 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:31.547 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:31.547 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:31.805 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:31.805 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:32.065 [2024-05-14 11:55:58.895440] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:32.065 [2024-05-14 11:55:58.900380] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1333b70 00:19:32.065 [2024-05-14 11:55:58.901854] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:32.065 11:55:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # sleep 1 00:19:32.999 11:55:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:32.999 11:55:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:32.999 11:55:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:32.999 11:55:59 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:32.999 11:55:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:32.999 11:55:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.999 11:55:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:33.257 "name": "raid_bdev1", 00:19:33.257 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:33.257 "strip_size_kb": 0, 00:19:33.257 "state": "online", 00:19:33.257 "raid_level": "raid1", 00:19:33.257 "superblock": false, 00:19:33.257 "num_base_bdevs": 2, 00:19:33.257 "num_base_bdevs_discovered": 2, 00:19:33.257 "num_base_bdevs_operational": 2, 00:19:33.257 "process": { 00:19:33.257 "type": "rebuild", 00:19:33.257 "target": "spare", 00:19:33.257 "progress": { 00:19:33.257 "blocks": 24576, 00:19:33.257 "percent": 37 00:19:33.257 } 00:19:33.257 }, 00:19:33.257 "base_bdevs_list": [ 00:19:33.257 { 00:19:33.257 "name": "spare", 00:19:33.257 "uuid": "6d29804a-1db4-51f3-9a23-84c9406c99ad", 00:19:33.257 "is_configured": true, 00:19:33.257 "data_offset": 0, 00:19:33.257 "data_size": 65536 00:19:33.257 }, 00:19:33.257 { 00:19:33.257 "name": "BaseBdev2", 00:19:33.257 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:33.257 "is_configured": true, 00:19:33.257 "data_offset": 0, 00:19:33.257 "data_size": 65536 00:19:33.257 } 00:19:33.257 ] 00:19:33.257 }' 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // 
"none"' 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # '[' false = true ']' 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@711 -- # local timeout=616 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.257 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:33.516 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:33.516 "name": "raid_bdev1", 00:19:33.516 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:33.516 "strip_size_kb": 0, 00:19:33.516 "state": "online", 00:19:33.516 "raid_level": "raid1", 00:19:33.516 "superblock": false, 00:19:33.516 "num_base_bdevs": 2, 00:19:33.516 
"num_base_bdevs_discovered": 2, 00:19:33.516 "num_base_bdevs_operational": 2, 00:19:33.516 "process": { 00:19:33.516 "type": "rebuild", 00:19:33.516 "target": "spare", 00:19:33.516 "progress": { 00:19:33.516 "blocks": 30720, 00:19:33.516 "percent": 46 00:19:33.516 } 00:19:33.516 }, 00:19:33.516 "base_bdevs_list": [ 00:19:33.516 { 00:19:33.516 "name": "spare", 00:19:33.516 "uuid": "6d29804a-1db4-51f3-9a23-84c9406c99ad", 00:19:33.516 "is_configured": true, 00:19:33.516 "data_offset": 0, 00:19:33.516 "data_size": 65536 00:19:33.516 }, 00:19:33.516 { 00:19:33.516 "name": "BaseBdev2", 00:19:33.516 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:33.516 "is_configured": true, 00:19:33.516 "data_offset": 0, 00:19:33.516 "data_size": 65536 00:19:33.516 } 00:19:33.516 ] 00:19:33.516 }' 00:19:33.516 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:33.516 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:33.516 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:33.774 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:33.774 11:56:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@716 -- # sleep 1 00:19:34.708 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:19:34.708 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:34.708 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:34.708 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:34.708 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:34.708 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:34.709 11:56:01 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.709 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:34.967 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:34.967 "name": "raid_bdev1", 00:19:34.967 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:34.967 "strip_size_kb": 0, 00:19:34.967 "state": "online", 00:19:34.967 "raid_level": "raid1", 00:19:34.967 "superblock": false, 00:19:34.967 "num_base_bdevs": 2, 00:19:34.967 "num_base_bdevs_discovered": 2, 00:19:34.967 "num_base_bdevs_operational": 2, 00:19:34.967 "process": { 00:19:34.967 "type": "rebuild", 00:19:34.967 "target": "spare", 00:19:34.967 "progress": { 00:19:34.967 "blocks": 59392, 00:19:34.967 "percent": 90 00:19:34.967 } 00:19:34.967 }, 00:19:34.967 "base_bdevs_list": [ 00:19:34.967 { 00:19:34.967 "name": "spare", 00:19:34.967 "uuid": "6d29804a-1db4-51f3-9a23-84c9406c99ad", 00:19:34.967 "is_configured": true, 00:19:34.967 "data_offset": 0, 00:19:34.967 "data_size": 65536 00:19:34.967 }, 00:19:34.967 { 00:19:34.967 "name": "BaseBdev2", 00:19:34.967 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:34.967 "is_configured": true, 00:19:34.967 "data_offset": 0, 00:19:34.967 "data_size": 65536 00:19:34.967 } 00:19:34.968 ] 00:19:34.968 }' 00:19:34.968 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:34.968 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:34.968 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:34.968 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:34.968 11:56:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@716 -- # sleep 1 
00:19:35.226 [2024-05-14 11:56:02.126639] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:35.226 [2024-05-14 11:56:02.126695] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:35.226 [2024-05-14 11:56:02.126731] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:36.160 11:56:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:19:36.160 11:56:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:36.160 11:56:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:36.160 11:56:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:36.160 11:56:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:36.160 11:56:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:36.160 11:56:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.160 11:56:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:36.160 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:36.160 "name": "raid_bdev1", 00:19:36.160 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:36.160 "strip_size_kb": 0, 00:19:36.160 "state": "online", 00:19:36.160 "raid_level": "raid1", 00:19:36.160 "superblock": false, 00:19:36.160 "num_base_bdevs": 2, 00:19:36.160 "num_base_bdevs_discovered": 2, 00:19:36.160 "num_base_bdevs_operational": 2, 00:19:36.160 "base_bdevs_list": [ 00:19:36.160 { 00:19:36.160 "name": "spare", 00:19:36.160 "uuid": "6d29804a-1db4-51f3-9a23-84c9406c99ad", 00:19:36.160 "is_configured": true, 00:19:36.160 
"data_offset": 0, 00:19:36.160 "data_size": 65536 00:19:36.160 }, 00:19:36.160 { 00:19:36.160 "name": "BaseBdev2", 00:19:36.160 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:36.160 "is_configured": true, 00:19:36.160 "data_offset": 0, 00:19:36.160 "data_size": 65536 00:19:36.160 } 00:19:36.160 ] 00:19:36.160 }' 00:19:36.160 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:36.160 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:36.160 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:36.418 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:19:36.418 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # break 00:19:36.418 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:36.418 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:36.418 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:36.418 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:36.418 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:36.418 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.418 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:36.677 "name": "raid_bdev1", 00:19:36.677 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:36.677 "strip_size_kb": 0, 00:19:36.677 "state": "online", 00:19:36.677 "raid_level": 
"raid1", 00:19:36.677 "superblock": false, 00:19:36.677 "num_base_bdevs": 2, 00:19:36.677 "num_base_bdevs_discovered": 2, 00:19:36.677 "num_base_bdevs_operational": 2, 00:19:36.677 "base_bdevs_list": [ 00:19:36.677 { 00:19:36.677 "name": "spare", 00:19:36.677 "uuid": "6d29804a-1db4-51f3-9a23-84c9406c99ad", 00:19:36.677 "is_configured": true, 00:19:36.677 "data_offset": 0, 00:19:36.677 "data_size": 65536 00:19:36.677 }, 00:19:36.677 { 00:19:36.677 "name": "BaseBdev2", 00:19:36.677 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:36.677 "is_configured": true, 00:19:36.677 "data_offset": 0, 00:19:36.677 "data_size": 65536 00:19:36.677 } 00:19:36.677 ] 00:19:36.677 }' 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:36.677 
11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.677 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:36.939 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:36.939 "name": "raid_bdev1", 00:19:36.939 "uuid": "edf3d389-dc88-4425-adc1-b04d1adb8110", 00:19:36.939 "strip_size_kb": 0, 00:19:36.939 "state": "online", 00:19:36.939 "raid_level": "raid1", 00:19:36.939 "superblock": false, 00:19:36.939 "num_base_bdevs": 2, 00:19:36.939 "num_base_bdevs_discovered": 2, 00:19:36.939 "num_base_bdevs_operational": 2, 00:19:36.939 "base_bdevs_list": [ 00:19:36.939 { 00:19:36.939 "name": "spare", 00:19:36.939 "uuid": "6d29804a-1db4-51f3-9a23-84c9406c99ad", 00:19:36.939 "is_configured": true, 00:19:36.939 "data_offset": 0, 00:19:36.939 "data_size": 65536 00:19:36.939 }, 00:19:36.939 { 00:19:36.939 "name": "BaseBdev2", 00:19:36.939 "uuid": "b2f83859-cb14-519c-81c6-df306aacad4f", 00:19:36.939 "is_configured": true, 00:19:36.939 "data_offset": 0, 00:19:36.939 "data_size": 65536 00:19:36.939 } 00:19:36.939 ] 00:19:36.939 }' 00:19:36.939 11:56:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:36.939 11:56:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.532 11:56:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:37.791 [2024-05-14 11:56:04.650170] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:37.791 [2024-05-14 
11:56:04.650197] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:37.791 [2024-05-14 11:56:04.650256] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:37.791 [2024-05-14 11:56:04.650312] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:37.791 [2024-05-14 11:56:04.650324] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1337a50 name raid_bdev1, state offline 00:19:37.791 11:56:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.791 11:56:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # jq length 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:38.050 
11:56:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:38.050 11:56:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:19:38.309 /dev/nbd0 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:38.309 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:38.310 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:38.310 1+0 records in 00:19:38.310 1+0 records out 00:19:38.310 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000181168 s, 22.6 MB/s 00:19:38.310 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:38.310 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:19:38.310 11:56:05 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:38.310 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:38.310 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:19:38.310 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:38.310 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:38.310 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:19:38.569 /dev/nbd1 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:38.569 1+0 records in 00:19:38.569 1+0 records out 
00:19:38.569 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308052 s, 13.3 MB/s 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@743 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:38.569 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:38.827 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:38.827 11:56:05 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:38.827 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:38.827 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:38.827 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:38.827 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:38.827 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:38.827 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:38.827 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:38.827 11:56:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@748 -- # '[' false = true ']' 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@795 -- # killprocess 1750303 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@946 -- # '[' -z 1750303 ']' 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # kill -0 1750303 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # uname 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1750303 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1750303' 00:19:39.087 killing process with pid 1750303 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@965 -- # kill 1750303 00:19:39.087 Received shutdown signal, test time was about 60.000000 seconds 00:19:39.087 00:19:39.087 Latency(us) 00:19:39.087 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:39.087 =================================================================================================================== 00:19:39.087 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:19:39.087 [2024-05-14 11:56:06.151419] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:39.087 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@970 -- # wait 1750303 00:19:39.346 [2024-05-14 11:56:06.178817] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:39.346 11:56:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@797 -- # return 0 00:19:39.346 00:19:39.346 real 0m21.430s 00:19:39.346 user 0m29.352s 00:19:39.346 sys 0m4.657s 00:19:39.346 11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:19:39.346 
11:56:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.346 ************************************ 00:19:39.346 END TEST raid_rebuild_test 00:19:39.346 ************************************ 00:19:39.606 11:56:06 bdev_raid -- bdev/bdev_raid.sh@824 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:19:39.606 11:56:06 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:19:39.606 11:56:06 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:19:39.606 11:56:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:39.606 ************************************ 00:19:39.606 START TEST raid_rebuild_test_sb 00:19:39.606 ************************************ 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false true 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local verify=true 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # 
echo BaseBdev2 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # local strip_size 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@582 -- # local create_arg 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local data_offset 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # raid_pid=1753260 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@603 -- # waitforlisten 1753260 /var/tmp/spdk-raid.sock 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1753260 ']' 00:19:39.606 11:56:06 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:39.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:19:39.606 11:56:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:39.606 [2024-05-14 11:56:06.560848] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:19:39.606 [2024-05-14 11:56:06.560916] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1753260 ] 00:19:39.606 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:39.606 Zero copy mechanism will not be used. 
00:19:39.606 [2024-05-14 11:56:06.690238] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:39.866 [2024-05-14 11:56:06.789282] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:19:39.866 [2024-05-14 11:56:06.855535] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:39.866 [2024-05-14 11:56:06.855574] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:40.433 11:56:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:19:40.433 11:56:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # return 0 00:19:40.433 11:56:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:19:40.433 11:56:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:40.691 BaseBdev1_malloc 00:19:40.691 11:56:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:40.949 [2024-05-14 11:56:07.967773] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:40.949 [2024-05-14 11:56:07.967823] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:40.949 [2024-05-14 11:56:07.967845] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a9960 00:19:40.949 [2024-05-14 11:56:07.967859] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:40.949 [2024-05-14 11:56:07.969490] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:40.949 [2024-05-14 11:56:07.969518] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:40.949 BaseBdev1 
00:19:40.949 11:56:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:19:40.949 11:56:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:41.209 BaseBdev2_malloc 00:19:41.209 11:56:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:41.468 [2024-05-14 11:56:08.461935] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:41.468 [2024-05-14 11:56:08.461974] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.468 [2024-05-14 11:56:08.461996] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b5cb40 00:19:41.468 [2024-05-14 11:56:08.462009] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.468 [2024-05-14 11:56:08.463390] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.468 [2024-05-14 11:56:08.463424] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:41.468 BaseBdev2 00:19:41.468 11:56:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:41.728 spare_malloc 00:19:41.728 11:56:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:41.986 spare_delay 00:19:41.986 11:56:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@614 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:42.246 [2024-05-14 11:56:09.188344] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:42.246 [2024-05-14 11:56:09.188384] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:42.246 [2024-05-14 11:56:09.188410] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a5700 00:19:42.246 [2024-05-14 11:56:09.188423] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:42.246 [2024-05-14 11:56:09.189832] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:42.246 [2024-05-14 11:56:09.189865] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:42.246 spare 00:19:42.246 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:42.505 [2024-05-14 11:56:09.433020] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:42.505 [2024-05-14 11:56:09.434285] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:42.505 [2024-05-14 11:56:09.434455] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x19a4a50 00:19:42.505 [2024-05-14 11:56:09.434469] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:42.505 [2024-05-14 11:56:09.434669] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a22d0 00:19:42.505 [2024-05-14 11:56:09.434812] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19a4a50 00:19:42.505 [2024-05-14 11:56:09.434822] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x19a4a50 00:19:42.505 [2024-05-14 11:56:09.434918] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.505 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:42.764 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:42.764 "name": "raid_bdev1", 00:19:42.764 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:42.764 "strip_size_kb": 0, 00:19:42.764 "state": "online", 00:19:42.764 "raid_level": "raid1", 00:19:42.764 "superblock": true, 00:19:42.764 "num_base_bdevs": 2, 00:19:42.764 "num_base_bdevs_discovered": 2, 00:19:42.764 
"num_base_bdevs_operational": 2, 00:19:42.764 "base_bdevs_list": [ 00:19:42.764 { 00:19:42.764 "name": "BaseBdev1", 00:19:42.764 "uuid": "628eaa17-640a-5596-a860-455d6ef2e960", 00:19:42.764 "is_configured": true, 00:19:42.764 "data_offset": 2048, 00:19:42.764 "data_size": 63488 00:19:42.764 }, 00:19:42.764 { 00:19:42.764 "name": "BaseBdev2", 00:19:42.764 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:42.764 "is_configured": true, 00:19:42.764 "data_offset": 2048, 00:19:42.764 "data_size": 63488 00:19:42.764 } 00:19:42.764 ] 00:19:42.764 }' 00:19:42.764 11:56:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:42.764 11:56:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:43.331 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:43.331 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:19:43.590 [2024-05-14 11:56:10.508252] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:43.590 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=63488 00:19:43.590 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.590 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # data_offset=2048 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@630 -- # local 
write_unit_size 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:43.849 11:56:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:44.107 [2024-05-14 11:56:11.005404] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a4960 00:19:44.108 /dev/nbd0 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:44.108 1+0 records in 00:19:44.108 1+0 records out 00:19:44.108 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257188 s, 15.9 MB/s 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:19:44.108 11:56:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:19:48.299 63488+0 records in 00:19:48.299 63488+0 records out 00:19:48.299 32505856 bytes (33 MB, 
31 MiB) copied, 4.21788 s, 7.7 MB/s 00:19:48.299 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:48.299 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:48.299 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:48.299 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:48.299 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:19:48.299 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:48.299 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:48.558 [2024-05-14 11:56:15.549993] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:48.558 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:48.558 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:48.558 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:48.558 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:48.558 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:48.558 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:48.558 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:48.558 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:48.558 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:19:48.817 [2024-05-14 11:56:15.786681] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.817 11:56:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.077 11:56:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:49.077 "name": "raid_bdev1", 00:19:49.077 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:49.077 "strip_size_kb": 0, 00:19:49.077 "state": "online", 00:19:49.077 "raid_level": "raid1", 00:19:49.077 "superblock": true, 00:19:49.077 "num_base_bdevs": 2, 00:19:49.077 "num_base_bdevs_discovered": 1, 00:19:49.077 
"num_base_bdevs_operational": 1, 00:19:49.077 "base_bdevs_list": [ 00:19:49.077 { 00:19:49.077 "name": null, 00:19:49.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.077 "is_configured": false, 00:19:49.077 "data_offset": 2048, 00:19:49.077 "data_size": 63488 00:19:49.077 }, 00:19:49.077 { 00:19:49.077 "name": "BaseBdev2", 00:19:49.077 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:49.077 "is_configured": true, 00:19:49.077 "data_offset": 2048, 00:19:49.077 "data_size": 63488 00:19:49.077 } 00:19:49.077 ] 00:19:49.077 }' 00:19:49.077 11:56:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:49.077 11:56:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:49.646 11:56:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:49.906 [2024-05-14 11:56:16.865591] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:49.906 [2024-05-14 11:56:16.870545] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a22d0 00:19:49.906 [2024-05-14 11:56:16.872805] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:49.906 11:56:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # sleep 1 00:19:50.843 11:56:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:50.843 11:56:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:50.843 11:56:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:50.843 11:56:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:50.843 11:56:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 
00:19:50.843 11:56:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.843 11:56:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:51.102 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:51.102 "name": "raid_bdev1", 00:19:51.102 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:51.102 "strip_size_kb": 0, 00:19:51.102 "state": "online", 00:19:51.102 "raid_level": "raid1", 00:19:51.102 "superblock": true, 00:19:51.102 "num_base_bdevs": 2, 00:19:51.102 "num_base_bdevs_discovered": 2, 00:19:51.102 "num_base_bdevs_operational": 2, 00:19:51.102 "process": { 00:19:51.102 "type": "rebuild", 00:19:51.102 "target": "spare", 00:19:51.102 "progress": { 00:19:51.102 "blocks": 24576, 00:19:51.102 "percent": 38 00:19:51.102 } 00:19:51.102 }, 00:19:51.102 "base_bdevs_list": [ 00:19:51.102 { 00:19:51.102 "name": "spare", 00:19:51.102 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:19:51.102 "is_configured": true, 00:19:51.102 "data_offset": 2048, 00:19:51.102 "data_size": 63488 00:19:51.102 }, 00:19:51.102 { 00:19:51.102 "name": "BaseBdev2", 00:19:51.102 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:51.102 "is_configured": true, 00:19:51.102 "data_offset": 2048, 00:19:51.102 "data_size": 63488 00:19:51.102 } 00:19:51.102 ] 00:19:51.102 }' 00:19:51.102 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:51.102 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:51.102 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:51.361 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:51.361 11:56:18 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:51.620 [2024-05-14 11:56:18.447316] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:51.620 [2024-05-14 11:56:18.485394] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:51.621 [2024-05-14 11:56:18.485445] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.621 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:51.880 
11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:51.880 "name": "raid_bdev1", 00:19:51.880 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:51.880 "strip_size_kb": 0, 00:19:51.880 "state": "online", 00:19:51.880 "raid_level": "raid1", 00:19:51.880 "superblock": true, 00:19:51.880 "num_base_bdevs": 2, 00:19:51.880 "num_base_bdevs_discovered": 1, 00:19:51.880 "num_base_bdevs_operational": 1, 00:19:51.880 "base_bdevs_list": [ 00:19:51.880 { 00:19:51.880 "name": null, 00:19:51.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.880 "is_configured": false, 00:19:51.880 "data_offset": 2048, 00:19:51.880 "data_size": 63488 00:19:51.880 }, 00:19:51.880 { 00:19:51.880 "name": "BaseBdev2", 00:19:51.880 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:51.880 "is_configured": true, 00:19:51.880 "data_offset": 2048, 00:19:51.880 "data_size": 63488 00:19:51.880 } 00:19:51.880 ] 00:19:51.880 }' 00:19:51.880 11:56:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:51.880 11:56:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:52.484 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:52.484 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:52.484 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:52.484 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:52.484 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:52.484 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.484 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] 
| select(.name == "raid_bdev1")' 00:19:52.484 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:52.484 "name": "raid_bdev1", 00:19:52.484 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:52.484 "strip_size_kb": 0, 00:19:52.484 "state": "online", 00:19:52.484 "raid_level": "raid1", 00:19:52.484 "superblock": true, 00:19:52.484 "num_base_bdevs": 2, 00:19:52.484 "num_base_bdevs_discovered": 1, 00:19:52.484 "num_base_bdevs_operational": 1, 00:19:52.484 "base_bdevs_list": [ 00:19:52.484 { 00:19:52.484 "name": null, 00:19:52.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:52.484 "is_configured": false, 00:19:52.484 "data_offset": 2048, 00:19:52.484 "data_size": 63488 00:19:52.484 }, 00:19:52.484 { 00:19:52.484 "name": "BaseBdev2", 00:19:52.484 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:52.484 "is_configured": true, 00:19:52.484 "data_offset": 2048, 00:19:52.484 "data_size": 63488 00:19:52.484 } 00:19:52.484 ] 00:19:52.484 }' 00:19:52.484 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:52.767 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:52.767 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:52.767 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:52.767 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:52.767 [2024-05-14 11:56:19.797888] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:52.767 [2024-05-14 11:56:19.802871] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a1260 00:19:52.767 [2024-05-14 11:56:19.804381] bdev_raid.c:2776:raid_bdev_process_thread_init: 
*NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:52.767 11:56:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # sleep 1 00:19:54.145 11:56:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:54.145 11:56:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:54.145 11:56:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:54.145 11:56:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:54.145 11:56:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:54.145 11:56:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.145 11:56:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:54.145 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:54.145 "name": "raid_bdev1", 00:19:54.145 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:54.145 "strip_size_kb": 0, 00:19:54.145 "state": "online", 00:19:54.145 "raid_level": "raid1", 00:19:54.146 "superblock": true, 00:19:54.146 "num_base_bdevs": 2, 00:19:54.146 "num_base_bdevs_discovered": 2, 00:19:54.146 "num_base_bdevs_operational": 2, 00:19:54.146 "process": { 00:19:54.146 "type": "rebuild", 00:19:54.146 "target": "spare", 00:19:54.146 "progress": { 00:19:54.146 "blocks": 22528, 00:19:54.146 "percent": 35 00:19:54.146 } 00:19:54.146 }, 00:19:54.146 "base_bdevs_list": [ 00:19:54.146 { 00:19:54.146 "name": "spare", 00:19:54.146 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:19:54.146 "is_configured": true, 00:19:54.146 "data_offset": 2048, 00:19:54.146 "data_size": 63488 00:19:54.146 }, 00:19:54.146 { 00:19:54.146 "name": "BaseBdev2", 
00:19:54.146 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:54.146 "is_configured": true, 00:19:54.146 "data_offset": 2048, 00:19:54.146 "data_size": 63488 00:19:54.146 } 00:19:54.146 ] 00:19:54.146 }' 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:19:54.146 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@711 -- # local timeout=637 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:54.146 
11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.146 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:54.405 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:54.405 "name": "raid_bdev1", 00:19:54.405 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:54.405 "strip_size_kb": 0, 00:19:54.405 "state": "online", 00:19:54.405 "raid_level": "raid1", 00:19:54.405 "superblock": true, 00:19:54.405 "num_base_bdevs": 2, 00:19:54.405 "num_base_bdevs_discovered": 2, 00:19:54.405 "num_base_bdevs_operational": 2, 00:19:54.405 "process": { 00:19:54.405 "type": "rebuild", 00:19:54.405 "target": "spare", 00:19:54.405 "progress": { 00:19:54.405 "blocks": 30720, 00:19:54.405 "percent": 48 00:19:54.405 } 00:19:54.405 }, 00:19:54.405 "base_bdevs_list": [ 00:19:54.405 { 00:19:54.405 "name": "spare", 00:19:54.405 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:19:54.405 "is_configured": true, 00:19:54.405 "data_offset": 2048, 00:19:54.405 "data_size": 63488 00:19:54.405 }, 00:19:54.405 { 00:19:54.405 "name": "BaseBdev2", 00:19:54.405 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:54.405 "is_configured": true, 00:19:54.405 "data_offset": 2048, 00:19:54.405 "data_size": 63488 00:19:54.405 } 00:19:54.405 ] 00:19:54.405 }' 00:19:54.405 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:54.405 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:54.405 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:54.405 11:56:21 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:54.405 11:56:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@716 -- # sleep 1 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:55.785 "name": "raid_bdev1", 00:19:55.785 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:55.785 "strip_size_kb": 0, 00:19:55.785 "state": "online", 00:19:55.785 "raid_level": "raid1", 00:19:55.785 "superblock": true, 00:19:55.785 "num_base_bdevs": 2, 00:19:55.785 "num_base_bdevs_discovered": 2, 00:19:55.785 "num_base_bdevs_operational": 2, 00:19:55.785 "process": { 00:19:55.785 "type": "rebuild", 00:19:55.785 "target": "spare", 00:19:55.785 "progress": { 00:19:55.785 "blocks": 57344, 00:19:55.785 "percent": 90 00:19:55.785 } 00:19:55.785 }, 00:19:55.785 "base_bdevs_list": [ 00:19:55.785 { 00:19:55.785 "name": "spare", 00:19:55.785 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:19:55.785 "is_configured": true, 00:19:55.785 
"data_offset": 2048, 00:19:55.785 "data_size": 63488 00:19:55.785 }, 00:19:55.785 { 00:19:55.785 "name": "BaseBdev2", 00:19:55.785 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:55.785 "is_configured": true, 00:19:55.785 "data_offset": 2048, 00:19:55.785 "data_size": 63488 00:19:55.785 } 00:19:55.785 ] 00:19:55.785 }' 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:55.785 11:56:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@716 -- # sleep 1 00:19:56.044 [2024-05-14 11:56:22.928343] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:56.044 [2024-05-14 11:56:22.928408] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:56.044 [2024-05-14 11:56:22.928489] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:56.979 11:56:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:19:56.979 11:56:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:56.979 11:56:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:56.979 11:56:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:56.979 11:56:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:56.979 11:56:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:56.979 11:56:23 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.979 11:56:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.979 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:56.979 "name": "raid_bdev1", 00:19:56.979 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:56.979 "strip_size_kb": 0, 00:19:56.979 "state": "online", 00:19:56.979 "raid_level": "raid1", 00:19:56.979 "superblock": true, 00:19:56.979 "num_base_bdevs": 2, 00:19:56.979 "num_base_bdevs_discovered": 2, 00:19:56.979 "num_base_bdevs_operational": 2, 00:19:56.979 "base_bdevs_list": [ 00:19:56.979 { 00:19:56.979 "name": "spare", 00:19:56.979 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:19:56.979 "is_configured": true, 00:19:56.979 "data_offset": 2048, 00:19:56.979 "data_size": 63488 00:19:56.979 }, 00:19:56.979 { 00:19:56.979 "name": "BaseBdev2", 00:19:56.979 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:56.979 "is_configured": true, 00:19:56.979 "data_offset": 2048, 00:19:56.979 "data_size": 63488 00:19:56.979 } 00:19:56.979 ] 00:19:56.979 }' 00:19:56.979 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # break 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local 
raid_bdev_name=raid_bdev1 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:57.237 "name": "raid_bdev1", 00:19:57.237 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:57.237 "strip_size_kb": 0, 00:19:57.237 "state": "online", 00:19:57.237 "raid_level": "raid1", 00:19:57.237 "superblock": true, 00:19:57.237 "num_base_bdevs": 2, 00:19:57.237 "num_base_bdevs_discovered": 2, 00:19:57.237 "num_base_bdevs_operational": 2, 00:19:57.237 "base_bdevs_list": [ 00:19:57.237 { 00:19:57.237 "name": "spare", 00:19:57.237 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:19:57.237 "is_configured": true, 00:19:57.237 "data_offset": 2048, 00:19:57.237 "data_size": 63488 00:19:57.237 }, 00:19:57.237 { 00:19:57.237 "name": "BaseBdev2", 00:19:57.237 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:57.237 "is_configured": true, 00:19:57.237 "data_offset": 2048, 00:19:57.237 "data_size": 63488 00:19:57.237 } 00:19:57.237 ] 00:19:57.237 }' 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:57.237 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:57.495 11:56:24 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:57.495 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:57.495 "name": "raid_bdev1", 00:19:57.495 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:19:57.495 "strip_size_kb": 0, 00:19:57.495 "state": "online", 00:19:57.495 "raid_level": "raid1", 00:19:57.495 "superblock": true, 00:19:57.495 "num_base_bdevs": 2, 00:19:57.495 "num_base_bdevs_discovered": 2, 00:19:57.495 "num_base_bdevs_operational": 2, 00:19:57.495 "base_bdevs_list": [ 
00:19:57.495 { 00:19:57.495 "name": "spare", 00:19:57.495 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:19:57.495 "is_configured": true, 00:19:57.495 "data_offset": 2048, 00:19:57.495 "data_size": 63488 00:19:57.495 }, 00:19:57.495 { 00:19:57.495 "name": "BaseBdev2", 00:19:57.495 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:19:57.495 "is_configured": true, 00:19:57.495 "data_offset": 2048, 00:19:57.495 "data_size": 63488 00:19:57.496 } 00:19:57.496 ] 00:19:57.496 }' 00:19:57.496 11:56:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:57.496 11:56:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:58.062 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:58.321 [2024-05-14 11:56:25.307395] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:58.321 [2024-05-14 11:56:25.307438] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:58.321 [2024-05-14 11:56:25.307501] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:58.321 [2024-05-14 11:56:25.307557] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:58.321 [2024-05-14 11:56:25.307569] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19a4a50 name raid_bdev1, state offline 00:19:58.321 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.321 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # jq length 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:58.669 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:19:58.931 /dev/nbd0 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 
00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:58.931 1+0 records in 00:19:58.931 1+0 records out 00:19:58.931 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254987 s, 16.1 MB/s 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:58.931 11:56:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:19:59.190 /dev/nbd1 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:59.190 1+0 records in 00:19:59.190 1+0 records out 00:19:59.190 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000316247 s, 13.0 MB/s 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:59.190 11:56:26 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@743 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:59.190 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:59.449 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:59.449 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:59.449 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:59.449 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:59.449 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:59.449 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:59.449 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:59.449 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:59.449 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:59.449 
11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:19:59.708 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:19:59.967 11:56:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:00.226 [2024-05-14 11:56:27.167729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:00.226 [2024-05-14 11:56:27.167781] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:00.226 [2024-05-14 
11:56:27.167806] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a7ed0 00:20:00.226 [2024-05-14 11:56:27.167820] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:00.226 [2024-05-14 11:56:27.169523] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:00.226 [2024-05-14 11:56:27.169555] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:00.226 [2024-05-14 11:56:27.169632] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:00.226 [2024-05-14 11:56:27.169662] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:00.226 BaseBdev1 00:20:00.226 11:56:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:20:00.226 11:56:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev2 ']' 00:20:00.226 11:56:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:20:00.484 11:56:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:00.742 [2024-05-14 11:56:27.644992] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:00.742 [2024-05-14 11:56:27.645033] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:00.742 [2024-05-14 11:56:27.645057] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a81d0 00:20:00.742 [2024-05-14 11:56:27.645070] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:00.742 [2024-05-14 11:56:27.645422] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:20:00.742 [2024-05-14 11:56:27.645440] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:00.742 [2024-05-14 11:56:27.645501] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev2 00:20:00.742 [2024-05-14 11:56:27.645514] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing raid bdev raid_bdev1 (1) 00:20:00.742 [2024-05-14 11:56:27.645524] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:00.742 [2024-05-14 11:56:27.645540] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b535d0 name raid_bdev1, state configuring 00:20:00.742 [2024-05-14 11:56:27.645571] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:00.742 BaseBdev2 00:20:00.742 11:56:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:01.001 11:56:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:01.259 [2024-05-14 11:56:28.094189] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:01.259 [2024-05-14 11:56:28.094236] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:01.259 [2024-05-14 11:56:28.094260] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a8c90 00:20:01.259 [2024-05-14 11:56:28.094273] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:01.259 [2024-05-14 11:56:28.094655] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:01.259 [2024-05-14 11:56:28.094673] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: spare 00:20:01.259 [2024-05-14 11:56:28.094750] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:20:01.259 [2024-05-14 11:56:28.094769] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:01.260 spare 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.260 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:01.260 [2024-05-14 11:56:28.195094] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x19a0a70 00:20:01.260 [2024-05-14 11:56:28.195110] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:01.260 [2024-05-14 
11:56:28.195316] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19a4300 00:20:01.260 [2024-05-14 11:56:28.195481] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19a0a70 00:20:01.260 [2024-05-14 11:56:28.195492] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19a0a70 00:20:01.260 [2024-05-14 11:56:28.195605] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:01.518 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:01.518 "name": "raid_bdev1", 00:20:01.518 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:01.518 "strip_size_kb": 0, 00:20:01.518 "state": "online", 00:20:01.518 "raid_level": "raid1", 00:20:01.518 "superblock": true, 00:20:01.518 "num_base_bdevs": 2, 00:20:01.518 "num_base_bdevs_discovered": 2, 00:20:01.518 "num_base_bdevs_operational": 2, 00:20:01.518 "base_bdevs_list": [ 00:20:01.518 { 00:20:01.518 "name": "spare", 00:20:01.518 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:20:01.518 "is_configured": true, 00:20:01.518 "data_offset": 2048, 00:20:01.518 "data_size": 63488 00:20:01.518 }, 00:20:01.518 { 00:20:01.518 "name": "BaseBdev2", 00:20:01.518 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:01.518 "is_configured": true, 00:20:01.518 "data_offset": 2048, 00:20:01.518 "data_size": 63488 00:20:01.518 } 00:20:01.518 ] 00:20:01.518 }' 00:20:01.518 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:01.518 11:56:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:02.085 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:02.085 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:02.085 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local 
process_type=none 00:20:02.085 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:02.085 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:02.085 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.085 11:56:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.344 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:02.344 "name": "raid_bdev1", 00:20:02.344 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:02.344 "strip_size_kb": 0, 00:20:02.344 "state": "online", 00:20:02.344 "raid_level": "raid1", 00:20:02.344 "superblock": true, 00:20:02.344 "num_base_bdevs": 2, 00:20:02.344 "num_base_bdevs_discovered": 2, 00:20:02.344 "num_base_bdevs_operational": 2, 00:20:02.344 "base_bdevs_list": [ 00:20:02.344 { 00:20:02.344 "name": "spare", 00:20:02.344 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:20:02.344 "is_configured": true, 00:20:02.344 "data_offset": 2048, 00:20:02.344 "data_size": 63488 00:20:02.344 }, 00:20:02.344 { 00:20:02.344 "name": "BaseBdev2", 00:20:02.344 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:02.344 "is_configured": true, 00:20:02.344 "data_offset": 2048, 00:20:02.344 "data_size": 63488 00:20:02.344 } 00:20:02.344 ] 00:20:02.344 }' 00:20:02.344 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:02.344 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:02.344 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:02.344 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:02.344 11:56:29 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.344 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:02.603 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:20:02.603 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:02.862 [2024-05-14 11:56:29.722612] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:02.862 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:03.121 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:03.121 "name": "raid_bdev1", 00:20:03.121 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:03.121 "strip_size_kb": 0, 00:20:03.121 "state": "online", 00:20:03.121 "raid_level": "raid1", 00:20:03.121 "superblock": true, 00:20:03.121 "num_base_bdevs": 2, 00:20:03.121 "num_base_bdevs_discovered": 1, 00:20:03.121 "num_base_bdevs_operational": 1, 00:20:03.121 "base_bdevs_list": [ 00:20:03.121 { 00:20:03.121 "name": null, 00:20:03.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.121 "is_configured": false, 00:20:03.121 "data_offset": 2048, 00:20:03.121 "data_size": 63488 00:20:03.121 }, 00:20:03.121 { 00:20:03.121 "name": "BaseBdev2", 00:20:03.121 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:03.121 "is_configured": true, 00:20:03.121 "data_offset": 2048, 00:20:03.121 "data_size": 63488 00:20:03.121 } 00:20:03.121 ] 00:20:03.121 }' 00:20:03.121 11:56:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:03.121 11:56:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:03.689 11:56:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:03.689 [2024-05-14 11:56:30.717261] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:03.689 [2024-05-14 11:56:30.717423] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:03.689 [2024-05-14 11:56:30.717440] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:03.689 [2024-05-14 11:56:30.717466] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:03.689 [2024-05-14 11:56:30.722318] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ab090 00:20:03.689 [2024-05-14 11:56:30.724600] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:03.689 11:56:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # sleep 1 00:20:05.066 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:05.066 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:05.066 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:05.067 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:05.067 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:05.067 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.067 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:05.067 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:05.067 "name": "raid_bdev1", 00:20:05.067 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:05.067 "strip_size_kb": 0, 00:20:05.067 "state": "online", 00:20:05.067 "raid_level": "raid1", 00:20:05.067 "superblock": true, 00:20:05.067 "num_base_bdevs": 2, 00:20:05.067 "num_base_bdevs_discovered": 2, 00:20:05.067 "num_base_bdevs_operational": 2, 00:20:05.067 "process": { 00:20:05.067 "type": "rebuild", 00:20:05.067 "target": "spare", 00:20:05.067 "progress": { 00:20:05.067 "blocks": 22528, 00:20:05.067 "percent": 35 
00:20:05.067 } 00:20:05.067 }, 00:20:05.067 "base_bdevs_list": [ 00:20:05.067 { 00:20:05.067 "name": "spare", 00:20:05.067 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:20:05.067 "is_configured": true, 00:20:05.067 "data_offset": 2048, 00:20:05.067 "data_size": 63488 00:20:05.067 }, 00:20:05.067 { 00:20:05.067 "name": "BaseBdev2", 00:20:05.067 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:05.067 "is_configured": true, 00:20:05.067 "data_offset": 2048, 00:20:05.067 "data_size": 63488 00:20:05.067 } 00:20:05.067 ] 00:20:05.067 }' 00:20:05.067 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:05.067 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:05.067 11:56:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:05.067 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:05.067 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:05.326 [2024-05-14 11:56:32.246992] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:05.326 [2024-05-14 11:56:32.337127] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:05.326 [2024-05-14 11:56:32.337173] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.326 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:05.585 11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:05.585 "name": "raid_bdev1", 00:20:05.585 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:05.585 "strip_size_kb": 0, 00:20:05.585 "state": "online", 00:20:05.585 "raid_level": "raid1", 00:20:05.585 "superblock": true, 00:20:05.585 "num_base_bdevs": 2, 00:20:05.585 "num_base_bdevs_discovered": 1, 00:20:05.585 "num_base_bdevs_operational": 1, 00:20:05.585 "base_bdevs_list": [ 00:20:05.585 { 00:20:05.585 "name": null, 00:20:05.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.585 "is_configured": false, 00:20:05.585 "data_offset": 2048, 00:20:05.585 "data_size": 63488 00:20:05.585 }, 00:20:05.585 { 00:20:05.585 "name": "BaseBdev2", 00:20:05.585 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:05.585 "is_configured": true, 00:20:05.585 "data_offset": 2048, 00:20:05.585 "data_size": 63488 00:20:05.585 } 00:20:05.585 ] 00:20:05.585 }' 00:20:05.585 
11:56:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:05.585 11:56:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:06.152 11:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:06.412 [2024-05-14 11:56:33.424804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:06.412 [2024-05-14 11:56:33.424857] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:06.412 [2024-05-14 11:56:33.424881] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a6a50 00:20:06.412 [2024-05-14 11:56:33.424894] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:06.412 [2024-05-14 11:56:33.425286] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:06.412 [2024-05-14 11:56:33.425305] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:06.412 [2024-05-14 11:56:33.425386] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:20:06.412 [2024-05-14 11:56:33.425408] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:06.412 [2024-05-14 11:56:33.425419] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:06.412 [2024-05-14 11:56:33.425437] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:06.412 [2024-05-14 11:56:33.430380] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16b1090 00:20:06.412 spare 00:20:06.412 [2024-05-14 11:56:33.431879] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:06.412 11:56:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # sleep 1 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:07.790 "name": "raid_bdev1", 00:20:07.790 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:07.790 "strip_size_kb": 0, 00:20:07.790 "state": "online", 00:20:07.790 "raid_level": "raid1", 00:20:07.790 "superblock": true, 00:20:07.790 "num_base_bdevs": 2, 00:20:07.790 "num_base_bdevs_discovered": 2, 00:20:07.790 "num_base_bdevs_operational": 2, 00:20:07.790 "process": { 00:20:07.790 "type": "rebuild", 00:20:07.790 "target": "spare", 00:20:07.790 "progress": { 00:20:07.790 "blocks": 24576, 00:20:07.790 
"percent": 38 00:20:07.790 } 00:20:07.790 }, 00:20:07.790 "base_bdevs_list": [ 00:20:07.790 { 00:20:07.790 "name": "spare", 00:20:07.790 "uuid": "3ef6a4f0-0337-5009-8be4-f5a9ed3f079c", 00:20:07.790 "is_configured": true, 00:20:07.790 "data_offset": 2048, 00:20:07.790 "data_size": 63488 00:20:07.790 }, 00:20:07.790 { 00:20:07.790 "name": "BaseBdev2", 00:20:07.790 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:07.790 "is_configured": true, 00:20:07.790 "data_offset": 2048, 00:20:07.790 "data_size": 63488 00:20:07.790 } 00:20:07.790 ] 00:20:07.790 }' 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:07.790 11:56:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:08.049 [2024-05-14 11:56:34.995057] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:08.049 [2024-05-14 11:56:35.044608] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:08.049 [2024-05-14 11:56:35.044654] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:08.049 11:56:35 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:08.049 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.308 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:08.308 "name": "raid_bdev1", 00:20:08.308 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:08.308 "strip_size_kb": 0, 00:20:08.308 "state": "online", 00:20:08.308 "raid_level": "raid1", 00:20:08.308 "superblock": true, 00:20:08.308 "num_base_bdevs": 2, 00:20:08.308 "num_base_bdevs_discovered": 1, 00:20:08.308 "num_base_bdevs_operational": 1, 00:20:08.308 "base_bdevs_list": [ 00:20:08.308 { 00:20:08.308 "name": null, 00:20:08.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.308 "is_configured": false, 00:20:08.308 "data_offset": 2048, 00:20:08.308 "data_size": 63488 00:20:08.308 }, 00:20:08.308 { 00:20:08.308 "name": "BaseBdev2", 00:20:08.308 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:08.308 "is_configured": true, 00:20:08.308 "data_offset": 2048, 00:20:08.308 "data_size": 63488 00:20:08.308 } 00:20:08.308 ] 
00:20:08.308 }' 00:20:08.308 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:08.308 11:56:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:08.876 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:08.876 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:08.876 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:08.876 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:08.876 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:08.876 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.876 11:56:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:09.134 11:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:09.134 "name": "raid_bdev1", 00:20:09.134 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:09.134 "strip_size_kb": 0, 00:20:09.134 "state": "online", 00:20:09.134 "raid_level": "raid1", 00:20:09.134 "superblock": true, 00:20:09.134 "num_base_bdevs": 2, 00:20:09.134 "num_base_bdevs_discovered": 1, 00:20:09.134 "num_base_bdevs_operational": 1, 00:20:09.134 "base_bdevs_list": [ 00:20:09.134 { 00:20:09.134 "name": null, 00:20:09.134 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.134 "is_configured": false, 00:20:09.134 "data_offset": 2048, 00:20:09.134 "data_size": 63488 00:20:09.134 }, 00:20:09.134 { 00:20:09.134 "name": "BaseBdev2", 00:20:09.134 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:09.134 "is_configured": true, 00:20:09.134 "data_offset": 2048, 00:20:09.134 
"data_size": 63488 00:20:09.134 } 00:20:09.134 ] 00:20:09.134 }' 00:20:09.134 11:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:09.134 11:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:09.134 11:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:09.393 11:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:09.393 11:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:09.393 11:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:09.652 [2024-05-14 11:56:36.693637] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:09.652 [2024-05-14 11:56:36.693688] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.652 [2024-05-14 11:56:36.693708] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19a47a0 00:20:09.652 [2024-05-14 11:56:36.693721] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.652 [2024-05-14 11:56:36.694096] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.652 [2024-05-14 11:56:36.694115] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:09.652 [2024-05-14 11:56:36.694181] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:09.652 [2024-05-14 11:56:36.694194] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 
00:20:09.652 [2024-05-14 11:56:36.694204] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:09.652 BaseBdev1 00:20:09.652 11:56:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@786 -- # sleep 1 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:11.023 "name": "raid_bdev1", 00:20:11.023 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:11.023 "strip_size_kb": 0, 00:20:11.023 "state": "online", 00:20:11.023 "raid_level": "raid1", 00:20:11.023 
"superblock": true, 00:20:11.023 "num_base_bdevs": 2, 00:20:11.023 "num_base_bdevs_discovered": 1, 00:20:11.023 "num_base_bdevs_operational": 1, 00:20:11.023 "base_bdevs_list": [ 00:20:11.023 { 00:20:11.023 "name": null, 00:20:11.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.023 "is_configured": false, 00:20:11.023 "data_offset": 2048, 00:20:11.023 "data_size": 63488 00:20:11.023 }, 00:20:11.023 { 00:20:11.023 "name": "BaseBdev2", 00:20:11.023 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:11.023 "is_configured": true, 00:20:11.023 "data_offset": 2048, 00:20:11.023 "data_size": 63488 00:20:11.023 } 00:20:11.023 ] 00:20:11.023 }' 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:11.023 11:56:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:11.591 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:11.591 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:11.591 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:11.591 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:11.591 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:11.591 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.591 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:11.849 "name": "raid_bdev1", 00:20:11.849 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:11.849 "strip_size_kb": 0, 00:20:11.849 "state": "online", 
00:20:11.849 "raid_level": "raid1", 00:20:11.849 "superblock": true, 00:20:11.849 "num_base_bdevs": 2, 00:20:11.849 "num_base_bdevs_discovered": 1, 00:20:11.849 "num_base_bdevs_operational": 1, 00:20:11.849 "base_bdevs_list": [ 00:20:11.849 { 00:20:11.849 "name": null, 00:20:11.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.849 "is_configured": false, 00:20:11.849 "data_offset": 2048, 00:20:11.849 "data_size": 63488 00:20:11.849 }, 00:20:11.849 { 00:20:11.849 "name": "BaseBdev2", 00:20:11.849 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:11.849 "is_configured": true, 00:20:11.849 "data_offset": 2048, 00:20:11.849 "data_size": 63488 00:20:11.849 } 00:20:11.849 ] 00:20:11.849 }' 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:11.849 11:56:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:12.109 [2024-05-14 11:56:39.128277] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:12.109 [2024-05-14 11:56:39.128421] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:12.109 [2024-05-14 11:56:39.128437] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:12.109 request: 00:20:12.109 { 00:20:12.109 "raid_bdev": "raid_bdev1", 00:20:12.109 "base_bdev": "BaseBdev1", 00:20:12.109 "method": "bdev_raid_add_base_bdev", 00:20:12.109 "req_id": 1 00:20:12.109 } 00:20:12.109 Got JSON-RPC error response 00:20:12.109 response: 00:20:12.109 { 00:20:12.109 "code": -22, 00:20:12.109 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:12.109 } 00:20:12.109 11:56:39 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:20:12.109 11:56:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:12.109 11:56:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:12.109 11:56:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:12.109 11:56:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # sleep 1 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:20:13.486 "name": "raid_bdev1", 00:20:13.486 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:13.486 "strip_size_kb": 0, 00:20:13.486 "state": "online", 00:20:13.486 "raid_level": "raid1", 00:20:13.486 "superblock": true, 00:20:13.486 "num_base_bdevs": 2, 00:20:13.486 "num_base_bdevs_discovered": 1, 00:20:13.486 "num_base_bdevs_operational": 1, 00:20:13.486 "base_bdevs_list": [ 00:20:13.486 { 00:20:13.486 "name": null, 00:20:13.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.486 "is_configured": false, 00:20:13.486 "data_offset": 2048, 00:20:13.486 "data_size": 63488 00:20:13.486 }, 00:20:13.486 { 00:20:13.486 "name": "BaseBdev2", 00:20:13.486 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:13.486 "is_configured": true, 00:20:13.486 "data_offset": 2048, 00:20:13.486 "data_size": 63488 00:20:13.486 } 00:20:13.486 ] 00:20:13.486 }' 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:13.486 11:56:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:14.053 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:14.053 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:14.053 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:14.053 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:14.053 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:14.053 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.053 11:56:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:14.311 11:56:41 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:14.311 "name": "raid_bdev1", 00:20:14.311 "uuid": "d4164f0a-0d51-42dc-bfe4-71aeb8d067af", 00:20:14.311 "strip_size_kb": 0, 00:20:14.311 "state": "online", 00:20:14.311 "raid_level": "raid1", 00:20:14.311 "superblock": true, 00:20:14.311 "num_base_bdevs": 2, 00:20:14.311 "num_base_bdevs_discovered": 1, 00:20:14.311 "num_base_bdevs_operational": 1, 00:20:14.311 "base_bdevs_list": [ 00:20:14.311 { 00:20:14.311 "name": null, 00:20:14.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.311 "is_configured": false, 00:20:14.311 "data_offset": 2048, 00:20:14.311 "data_size": 63488 00:20:14.311 }, 00:20:14.311 { 00:20:14.311 "name": "BaseBdev2", 00:20:14.311 "uuid": "3dd12009-85ca-5e61-bd65-53832a43849f", 00:20:14.312 "is_configured": true, 00:20:14.312 "data_offset": 2048, 00:20:14.312 "data_size": 63488 00:20:14.312 } 00:20:14.312 ] 00:20:14.312 }' 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # killprocess 1753260 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1753260 ']' 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # kill -0 1753260 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # uname 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1753260 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1753260' 00:20:14.312 killing process with pid 1753260 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@965 -- # kill 1753260 00:20:14.312 Received shutdown signal, test time was about 60.000000 seconds 00:20:14.312 00:20:14.312 Latency(us) 00:20:14.312 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:14.312 =================================================================================================================== 00:20:14.312 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:14.312 [2024-05-14 11:56:41.362440] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:14.312 [2024-05-14 11:56:41.362545] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:14.312 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@970 -- # wait 1753260 00:20:14.312 [2024-05-14 11:56:41.362591] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:14.312 [2024-05-14 11:56:41.362605] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19a0a70 name raid_bdev1, state offline 00:20:14.312 [2024-05-14 11:56:41.393747] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:14.571 11:56:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@797 -- # return 0 00:20:14.571 00:20:14.571 real 0m35.130s 00:20:14.571 user 0m52.232s 00:20:14.571 sys 0m6.151s 00:20:14.571 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1122 -- # 
xtrace_disable 00:20:14.571 11:56:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:14.571 ************************************ 00:20:14.571 END TEST raid_rebuild_test_sb 00:20:14.571 ************************************ 00:20:14.830 11:56:41 bdev_raid -- bdev/bdev_raid.sh@825 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:20:14.830 11:56:41 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:20:14.830 11:56:41 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:20:14.830 11:56:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:14.830 ************************************ 00:20:14.830 START TEST raid_rebuild_test_io 00:20:14.830 ************************************ 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 false true true 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local superblock=false 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local background_io=true 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local verify=true 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:14.830 11:56:41 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # local strip_size 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@582 -- # local create_arg 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local data_offset 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # '[' false = true ']' 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # raid_pid=1758264 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # waitforlisten 1758264 /var/tmp/spdk-raid.sock 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@827 -- # '[' -z 1758264 ']' 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 
-- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:14.830 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:20:14.830 11:56:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:14.830 [2024-05-14 11:56:41.780637] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:20:14.830 [2024-05-14 11:56:41.780699] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1758264 ] 00:20:14.830 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:14.830 Zero copy mechanism will not be used. 
00:20:14.830 [2024-05-14 11:56:41.909994] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:15.088 [2024-05-14 11:56:42.018407] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:20:15.088 [2024-05-14 11:56:42.081812] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:15.088 [2024-05-14 11:56:42.081844] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:15.655 11:56:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:20:15.655 11:56:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # return 0 00:20:15.655 11:56:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:20:15.655 11:56:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:15.914 BaseBdev1_malloc 00:20:15.914 11:56:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:16.172 [2024-05-14 11:56:43.202813] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:16.173 [2024-05-14 11:56:43.202856] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.173 [2024-05-14 11:56:43.202878] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a40960 00:20:16.173 [2024-05-14 11:56:43.202891] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.173 [2024-05-14 11:56:43.204544] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:16.173 [2024-05-14 11:56:43.204571] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:16.173 BaseBdev1 
00:20:16.173 11:56:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:20:16.173 11:56:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:16.432 BaseBdev2_malloc 00:20:16.432 11:56:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:16.691 [2024-05-14 11:56:43.694248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:16.691 [2024-05-14 11:56:43.694293] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.691 [2024-05-14 11:56:43.694317] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bf3b40 00:20:16.691 [2024-05-14 11:56:43.694330] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.691 [2024-05-14 11:56:43.695880] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:16.691 [2024-05-14 11:56:43.695913] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:16.691 BaseBdev2 00:20:16.691 11:56:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:17.017 spare_malloc 00:20:17.017 11:56:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:17.275 spare_delay 00:20:17.275 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@614 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:17.534 [2024-05-14 11:56:44.425986] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:17.534 [2024-05-14 11:56:44.426030] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:17.534 [2024-05-14 11:56:44.426049] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a3c700 00:20:17.534 [2024-05-14 11:56:44.426061] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:17.534 [2024-05-14 11:56:44.427614] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:17.534 [2024-05-14 11:56:44.427641] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:17.534 spare 00:20:17.534 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:17.793 [2024-05-14 11:56:44.662628] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:17.793 [2024-05-14 11:56:44.663935] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:17.793 [2024-05-14 11:56:44.664012] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a3ba50 00:20:17.793 [2024-05-14 11:56:44.664023] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:17.793 [2024-05-14 11:56:44.664229] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a392d0 00:20:17.793 [2024-05-14 11:56:44.664373] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a3ba50 00:20:17.793 [2024-05-14 11:56:44.664383] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1a3ba50 00:20:17.793 [2024-05-14 11:56:44.664508] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:17.793 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:17.793 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:17.793 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:17.793 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:17.793 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:17.793 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:17.793 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:17.793 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:17.794 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:17.794 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:17.794 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.794 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.052 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:18.052 "name": "raid_bdev1", 00:20:18.052 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:18.052 "strip_size_kb": 0, 00:20:18.052 "state": "online", 00:20:18.052 "raid_level": "raid1", 00:20:18.052 "superblock": false, 00:20:18.052 "num_base_bdevs": 2, 00:20:18.052 "num_base_bdevs_discovered": 2, 00:20:18.052 
"num_base_bdevs_operational": 2, 00:20:18.052 "base_bdevs_list": [ 00:20:18.052 { 00:20:18.052 "name": "BaseBdev1", 00:20:18.052 "uuid": "2d284de0-7ead-577e-8b01-dc39a419ca25", 00:20:18.052 "is_configured": true, 00:20:18.052 "data_offset": 0, 00:20:18.052 "data_size": 65536 00:20:18.052 }, 00:20:18.052 { 00:20:18.052 "name": "BaseBdev2", 00:20:18.052 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:18.052 "is_configured": true, 00:20:18.052 "data_offset": 0, 00:20:18.052 "data_size": 65536 00:20:18.052 } 00:20:18.052 ] 00:20:18.052 }' 00:20:18.052 11:56:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:18.052 11:56:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:18.620 11:56:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:18.620 11:56:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:20:18.620 [2024-05-14 11:56:45.705665] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:18.879 11:56:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=65536 00:20:18.879 11:56:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.879 11:56:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:19.138 11:56:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # data_offset=0 00:20:19.138 11:56:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@626 -- # '[' true = true ']' 00:20:19.138 11:56:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock 
perform_tests 00:20:19.138 11:56:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:19.138 [2024-05-14 11:56:46.076476] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a3c990 00:20:19.138 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:19.138 Zero copy mechanism will not be used. 00:20:19.138 Running I/O for 60 seconds... 00:20:19.139 [2024-05-14 11:56:46.188712] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:19.139 [2024-05-14 11:56:46.188890] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1a3c990 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.139 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:19.398 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:19.398 "name": "raid_bdev1", 00:20:19.398 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:19.398 "strip_size_kb": 0, 00:20:19.398 "state": "online", 00:20:19.398 "raid_level": "raid1", 00:20:19.398 "superblock": false, 00:20:19.398 "num_base_bdevs": 2, 00:20:19.398 "num_base_bdevs_discovered": 1, 00:20:19.398 "num_base_bdevs_operational": 1, 00:20:19.398 "base_bdevs_list": [ 00:20:19.398 { 00:20:19.398 "name": null, 00:20:19.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.398 "is_configured": false, 00:20:19.398 "data_offset": 0, 00:20:19.398 "data_size": 65536 00:20:19.398 }, 00:20:19.398 { 00:20:19.398 "name": "BaseBdev2", 00:20:19.398 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:19.398 "is_configured": true, 00:20:19.398 "data_offset": 0, 00:20:19.398 "data_size": 65536 00:20:19.398 } 00:20:19.398 ] 00:20:19.398 }' 00:20:19.398 11:56:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:19.398 11:56:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:19.965 11:56:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:20.225 [2024-05-14 11:56:47.253257] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:20.225 [2024-05-14 11:56:47.287422] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1748090 00:20:20.225 [2024-05-14 11:56:47.289803] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 
00:20:20.225 11:56:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # sleep 1 00:20:20.484 [2024-05-14 11:56:47.416340] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:20.484 [2024-05-14 11:56:47.416834] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:20.743 [2024-05-14 11:56:47.642354] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:21.003 [2024-05-14 11:56:47.882325] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:21.263 [2024-05-14 11:56:48.111304] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:21.263 [2024-05-14 11:56:48.111575] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:21.263 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:21.263 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:21.263 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:21.263 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:21.263 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:21.263 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.263 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:21.522 [2024-05-14 11:56:48.477396] bdev_raid.c: 
852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:21.522 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:21.522 "name": "raid_bdev1", 00:20:21.522 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:21.522 "strip_size_kb": 0, 00:20:21.522 "state": "online", 00:20:21.522 "raid_level": "raid1", 00:20:21.522 "superblock": false, 00:20:21.522 "num_base_bdevs": 2, 00:20:21.522 "num_base_bdevs_discovered": 2, 00:20:21.522 "num_base_bdevs_operational": 2, 00:20:21.522 "process": { 00:20:21.523 "type": "rebuild", 00:20:21.523 "target": "spare", 00:20:21.523 "progress": { 00:20:21.523 "blocks": 14336, 00:20:21.523 "percent": 21 00:20:21.523 } 00:20:21.523 }, 00:20:21.523 "base_bdevs_list": [ 00:20:21.523 { 00:20:21.523 "name": "spare", 00:20:21.523 "uuid": "2b50abd8-b231-5663-b170-e67335d4df10", 00:20:21.523 "is_configured": true, 00:20:21.523 "data_offset": 0, 00:20:21.523 "data_size": 65536 00:20:21.523 }, 00:20:21.523 { 00:20:21.523 "name": "BaseBdev2", 00:20:21.523 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:21.523 "is_configured": true, 00:20:21.523 "data_offset": 0, 00:20:21.523 "data_size": 65536 00:20:21.523 } 00:20:21.523 ] 00:20:21.523 }' 00:20:21.523 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:21.523 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:21.523 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:21.782 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:21.782 11:56:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:21.782 [2024-05-14 11:56:48.847176] bdev_raid.c: 
852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:22.041 [2024-05-14 11:56:48.875221] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:22.041 [2024-05-14 11:56:48.976844] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:22.041 [2024-05-14 11:56:48.986288] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:22.041 [2024-05-14 11:56:49.004525] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:22.041 [2024-05-14 11:56:49.011159] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1a3c990 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.041 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:22.300 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:22.300 "name": "raid_bdev1", 00:20:22.300 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:22.300 "strip_size_kb": 0, 00:20:22.300 "state": "online", 00:20:22.300 "raid_level": "raid1", 00:20:22.300 "superblock": false, 00:20:22.300 "num_base_bdevs": 2, 00:20:22.300 "num_base_bdevs_discovered": 1, 00:20:22.300 "num_base_bdevs_operational": 1, 00:20:22.300 "base_bdevs_list": [ 00:20:22.300 { 00:20:22.300 "name": null, 00:20:22.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.300 "is_configured": false, 00:20:22.300 "data_offset": 0, 00:20:22.300 "data_size": 65536 00:20:22.300 }, 00:20:22.300 { 00:20:22.300 "name": "BaseBdev2", 00:20:22.300 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:22.300 "is_configured": true, 00:20:22.300 "data_offset": 0, 00:20:22.300 "data_size": 65536 00:20:22.300 } 00:20:22.300 ] 00:20:22.300 }' 00:20:22.300 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:22.300 11:56:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:22.869 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:22.869 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:22.869 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:22.869 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:22.869 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:22.869 11:56:49 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.869 11:56:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:23.129 11:56:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:23.129 "name": "raid_bdev1", 00:20:23.129 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:23.129 "strip_size_kb": 0, 00:20:23.129 "state": "online", 00:20:23.129 "raid_level": "raid1", 00:20:23.129 "superblock": false, 00:20:23.129 "num_base_bdevs": 2, 00:20:23.129 "num_base_bdevs_discovered": 1, 00:20:23.129 "num_base_bdevs_operational": 1, 00:20:23.129 "base_bdevs_list": [ 00:20:23.129 { 00:20:23.129 "name": null, 00:20:23.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:23.129 "is_configured": false, 00:20:23.129 "data_offset": 0, 00:20:23.129 "data_size": 65536 00:20:23.129 }, 00:20:23.129 { 00:20:23.129 "name": "BaseBdev2", 00:20:23.129 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:23.129 "is_configured": true, 00:20:23.129 "data_offset": 0, 00:20:23.129 "data_size": 65536 00:20:23.129 } 00:20:23.129 ] 00:20:23.129 }' 00:20:23.129 11:56:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:23.388 11:56:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:23.388 11:56:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:23.388 11:56:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:23.388 11:56:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:23.648 [2024-05-14 11:56:50.488810] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:23.648 [2024-05-14 11:56:50.540569] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a3ded0 00:20:23.648 [2024-05-14 11:56:50.542089] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:23.648 11:56:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # sleep 1 00:20:23.648 [2024-05-14 11:56:50.669448] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:23.648 [2024-05-14 11:56:50.669897] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:23.910 [2024-05-14 11:56:50.881415] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:23.910 [2024-05-14 11:56:50.881651] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:24.528 [2024-05-14 11:56:51.355816] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:24.528 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:24.528 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:24.528 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:24.528 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:24.528 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:24.528 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.528 
11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.787 [2024-05-14 11:56:51.713337] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:24.787 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:24.787 "name": "raid_bdev1", 00:20:24.787 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:24.787 "strip_size_kb": 0, 00:20:24.787 "state": "online", 00:20:24.787 "raid_level": "raid1", 00:20:24.787 "superblock": false, 00:20:24.787 "num_base_bdevs": 2, 00:20:24.787 "num_base_bdevs_discovered": 2, 00:20:24.787 "num_base_bdevs_operational": 2, 00:20:24.787 "process": { 00:20:24.787 "type": "rebuild", 00:20:24.787 "target": "spare", 00:20:24.787 "progress": { 00:20:24.787 "blocks": 16384, 00:20:24.787 "percent": 25 00:20:24.787 } 00:20:24.787 }, 00:20:24.787 "base_bdevs_list": [ 00:20:24.787 { 00:20:24.787 "name": "spare", 00:20:24.787 "uuid": "2b50abd8-b231-5663-b170-e67335d4df10", 00:20:24.787 "is_configured": true, 00:20:24.787 "data_offset": 0, 00:20:24.787 "data_size": 65536 00:20:24.787 }, 00:20:24.787 { 00:20:24.787 "name": "BaseBdev2", 00:20:24.787 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:24.787 "is_configured": true, 00:20:24.787 "data_offset": 0, 00:20:24.787 "data_size": 65536 00:20:24.787 } 00:20:24.787 ] 00:20:24.787 }' 00:20:24.787 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:24.787 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:24.787 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # '[' false 
= true ']' 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@711 -- # local timeout=667 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.046 11:56:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.046 [2024-05-14 11:56:52.067264] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:25.046 [2024-05-14 11:56:52.067502] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:25.306 11:56:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:25.306 "name": "raid_bdev1", 00:20:25.306 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:25.306 "strip_size_kb": 0, 00:20:25.306 
"state": "online", 00:20:25.306 "raid_level": "raid1", 00:20:25.306 "superblock": false, 00:20:25.306 "num_base_bdevs": 2, 00:20:25.306 "num_base_bdevs_discovered": 2, 00:20:25.306 "num_base_bdevs_operational": 2, 00:20:25.306 "process": { 00:20:25.306 "type": "rebuild", 00:20:25.306 "target": "spare", 00:20:25.306 "progress": { 00:20:25.306 "blocks": 22528, 00:20:25.306 "percent": 34 00:20:25.306 } 00:20:25.306 }, 00:20:25.306 "base_bdevs_list": [ 00:20:25.306 { 00:20:25.306 "name": "spare", 00:20:25.306 "uuid": "2b50abd8-b231-5663-b170-e67335d4df10", 00:20:25.306 "is_configured": true, 00:20:25.306 "data_offset": 0, 00:20:25.306 "data_size": 65536 00:20:25.306 }, 00:20:25.306 { 00:20:25.306 "name": "BaseBdev2", 00:20:25.306 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:25.306 "is_configured": true, 00:20:25.306 "data_offset": 0, 00:20:25.306 "data_size": 65536 00:20:25.306 } 00:20:25.306 ] 00:20:25.306 }' 00:20:25.306 11:56:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:25.306 11:56:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:25.306 11:56:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:25.306 11:56:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:25.306 11:56:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:25.567 [2024-05-14 11:56:52.555989] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:26.134 [2024-05-14 11:56:52.928228] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:26.393 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:26.393 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:26.393 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:26.393 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:26.393 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:26.393 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:26.394 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:26.394 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.394 [2024-05-14 11:56:53.377232] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:20:26.653 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:26.653 "name": "raid_bdev1", 00:20:26.653 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:26.653 "strip_size_kb": 0, 00:20:26.653 "state": "online", 00:20:26.653 "raid_level": "raid1", 00:20:26.653 "superblock": false, 00:20:26.653 "num_base_bdevs": 2, 00:20:26.653 "num_base_bdevs_discovered": 2, 00:20:26.653 "num_base_bdevs_operational": 2, 00:20:26.653 "process": { 00:20:26.653 "type": "rebuild", 00:20:26.653 "target": "spare", 00:20:26.653 "progress": { 00:20:26.653 "blocks": 40960, 00:20:26.653 "percent": 62 00:20:26.653 } 00:20:26.653 }, 00:20:26.653 "base_bdevs_list": [ 00:20:26.653 { 00:20:26.653 "name": "spare", 00:20:26.653 "uuid": "2b50abd8-b231-5663-b170-e67335d4df10", 00:20:26.653 "is_configured": true, 00:20:26.653 "data_offset": 0, 00:20:26.653 "data_size": 65536 00:20:26.653 }, 00:20:26.653 { 00:20:26.653 "name": "BaseBdev2", 00:20:26.653 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 
00:20:26.653 "is_configured": true, 00:20:26.653 "data_offset": 0, 00:20:26.653 "data_size": 65536 00:20:26.653 } 00:20:26.653 ] 00:20:26.653 }' 00:20:26.653 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:26.653 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:26.653 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:26.653 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:26.653 11:56:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:26.912 [2024-05-14 11:56:53.826920] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:20:27.171 [2024-05-14 11:56:54.168236] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:20:27.430 [2024-05-14 11:56:54.500420] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:20:27.689 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:27.689 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:27.689 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:27.689 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:27.689 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:27.689 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:27.689 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.689 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.689 [2024-05-14 11:56:54.629384] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:20:27.948 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:27.948 "name": "raid_bdev1", 00:20:27.948 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:27.948 "strip_size_kb": 0, 00:20:27.948 "state": "online", 00:20:27.948 "raid_level": "raid1", 00:20:27.948 "superblock": false, 00:20:27.948 "num_base_bdevs": 2, 00:20:27.948 "num_base_bdevs_discovered": 2, 00:20:27.948 "num_base_bdevs_operational": 2, 00:20:27.948 "process": { 00:20:27.948 "type": "rebuild", 00:20:27.948 "target": "spare", 00:20:27.948 "progress": { 00:20:27.948 "blocks": 61440, 00:20:27.948 "percent": 93 00:20:27.948 } 00:20:27.948 }, 00:20:27.948 "base_bdevs_list": [ 00:20:27.948 { 00:20:27.948 "name": "spare", 00:20:27.948 "uuid": "2b50abd8-b231-5663-b170-e67335d4df10", 00:20:27.948 "is_configured": true, 00:20:27.948 "data_offset": 0, 00:20:27.948 "data_size": 65536 00:20:27.948 }, 00:20:27.948 { 00:20:27.948 "name": "BaseBdev2", 00:20:27.948 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:27.948 "is_configured": true, 00:20:27.948 "data_offset": 0, 00:20:27.948 "data_size": 65536 00:20:27.948 } 00:20:27.948 ] 00:20:27.948 }' 00:20:27.948 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:27.948 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:27.948 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:27.948 11:56:54 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:27.948 11:56:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:27.948 [2024-05-14 11:56:54.979274] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:28.208 [2024-05-14 11:56:55.079597] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:28.208 [2024-05-14 11:56:55.081237] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:29.146 11:56:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:29.146 11:56:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:29.146 11:56:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:29.146 11:56:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:29.146 11:56:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:29.146 11:56:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:29.146 11:56:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.146 11:56:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.146 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:29.146 "name": "raid_bdev1", 00:20:29.146 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:29.146 "strip_size_kb": 0, 00:20:29.146 "state": "online", 00:20:29.146 "raid_level": "raid1", 00:20:29.146 "superblock": false, 00:20:29.146 "num_base_bdevs": 2, 00:20:29.146 "num_base_bdevs_discovered": 2, 00:20:29.146 "num_base_bdevs_operational": 2, 00:20:29.146 
"base_bdevs_list": [ 00:20:29.146 { 00:20:29.146 "name": "spare", 00:20:29.146 "uuid": "2b50abd8-b231-5663-b170-e67335d4df10", 00:20:29.146 "is_configured": true, 00:20:29.146 "data_offset": 0, 00:20:29.146 "data_size": 65536 00:20:29.146 }, 00:20:29.146 { 00:20:29.146 "name": "BaseBdev2", 00:20:29.146 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:29.146 "is_configured": true, 00:20:29.146 "data_offset": 0, 00:20:29.146 "data_size": 65536 00:20:29.146 } 00:20:29.146 ] 00:20:29.146 }' 00:20:29.146 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:29.146 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:29.146 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:29.406 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:20:29.406 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # break 00:20:29.406 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:29.406 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:29.406 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:29.406 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:29.406 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:29.406 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.406 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- 
# raid_bdev_info='{ 00:20:29.665 "name": "raid_bdev1", 00:20:29.665 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:29.665 "strip_size_kb": 0, 00:20:29.665 "state": "online", 00:20:29.665 "raid_level": "raid1", 00:20:29.665 "superblock": false, 00:20:29.665 "num_base_bdevs": 2, 00:20:29.665 "num_base_bdevs_discovered": 2, 00:20:29.665 "num_base_bdevs_operational": 2, 00:20:29.665 "base_bdevs_list": [ 00:20:29.665 { 00:20:29.665 "name": "spare", 00:20:29.665 "uuid": "2b50abd8-b231-5663-b170-e67335d4df10", 00:20:29.665 "is_configured": true, 00:20:29.665 "data_offset": 0, 00:20:29.665 "data_size": 65536 00:20:29.665 }, 00:20:29.665 { 00:20:29.665 "name": "BaseBdev2", 00:20:29.665 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:29.665 "is_configured": true, 00:20:29.665 "data_offset": 0, 00:20:29.665 "data_size": 65536 00:20:29.665 } 00:20:29.665 ] 00:20:29.665 }' 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=2 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.665 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.925 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:29.925 "name": "raid_bdev1", 00:20:29.925 "uuid": "9166b3e5-df61-487e-8364-b43aaa3fd000", 00:20:29.925 "strip_size_kb": 0, 00:20:29.925 "state": "online", 00:20:29.925 "raid_level": "raid1", 00:20:29.925 "superblock": false, 00:20:29.925 "num_base_bdevs": 2, 00:20:29.925 "num_base_bdevs_discovered": 2, 00:20:29.925 "num_base_bdevs_operational": 2, 00:20:29.925 "base_bdevs_list": [ 00:20:29.925 { 00:20:29.925 "name": "spare", 00:20:29.925 "uuid": "2b50abd8-b231-5663-b170-e67335d4df10", 00:20:29.925 "is_configured": true, 00:20:29.925 "data_offset": 0, 00:20:29.925 "data_size": 65536 00:20:29.925 }, 00:20:29.925 { 00:20:29.925 "name": "BaseBdev2", 00:20:29.925 "uuid": "ac9df7de-3bf2-5821-a549-fa661d213394", 00:20:29.925 "is_configured": true, 00:20:29.925 "data_offset": 0, 00:20:29.925 "data_size": 65536 00:20:29.925 } 00:20:29.925 ] 00:20:29.925 }' 00:20:29.925 11:56:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:29.925 11:56:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:30.493 11:56:57 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:30.752 [2024-05-14 11:56:57.664784] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:30.752 [2024-05-14 11:56:57.664819] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:30.752 00:20:30.752 Latency(us) 00:20:30.752 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:30.752 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:20:30.752 raid_bdev1 : 11.61 101.69 305.07 0.00 0.00 13766.23 299.19 119446.48 00:20:30.752 =================================================================================================================== 00:20:30.752 Total : 101.69 305.07 0.00 0.00 13766.23 299.19 119446.48 00:20:30.752 [2024-05-14 11:56:57.724866] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:30.752 [2024-05-14 11:56:57.724895] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:30.752 [2024-05-14 11:56:57.724970] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:30.752 [2024-05-14 11:56:57.724982] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3ba50 name raid_bdev1, state offline 00:20:30.752 0 00:20:30.752 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # jq length 00:20:30.752 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.011 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:20:31.011 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@728 -- # '[' true = true ']' 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:31.012 11:56:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:20:31.271 /dev/nbd0 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:20:31.271 11:56:58 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:31.271 1+0 records in 00:20:31.271 1+0 records out 00:20:31.271 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267604 s, 15.3 MB/s 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev2 ']' 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:31.271 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:20:31.531 /dev/nbd1 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 
-- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:31.531 1+0 records in 00:20:31.531 1+0 records out 00:20:31.531 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024915 s, 16.4 MB/s 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:31.531 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@736 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:31.790 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:31.790 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:31.790 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:31.790 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:31.790 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:31.790 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:31.790 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:32.049 11:56:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:32.308 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:32.308 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:32.308 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # 
local nbd_name=nbd0 00:20:32.308 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@748 -- # '[' false = true ']' 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@795 -- # killprocess 1758264 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@946 -- # '[' -z 1758264 ']' 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # kill -0 1758264 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # uname 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1758264 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1758264' 00:20:32.309 killing process with pid 1758264 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@965 -- # kill 1758264 00:20:32.309 Received shutdown signal, test time was about 13.111953 seconds 00:20:32.309 00:20:32.309 Latency(us) 00:20:32.309 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:32.309 
=================================================================================================================== 00:20:32.309 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:20:32.309 [2024-05-14 11:56:59.222660] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:32.309 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@970 -- # wait 1758264 00:20:32.309 [2024-05-14 11:56:59.244370] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@797 -- # return 0 00:20:32.569 00:20:32.569 real 0m17.757s 00:20:32.569 user 0m26.976s 00:20:32.569 sys 0m2.767s 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:32.569 ************************************ 00:20:32.569 END TEST raid_rebuild_test_io 00:20:32.569 ************************************ 00:20:32.569 11:56:59 bdev_raid -- bdev/bdev_raid.sh@826 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:20:32.569 11:56:59 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:20:32.569 11:56:59 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:20:32.569 11:56:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:32.569 ************************************ 00:20:32.569 START TEST raid_rebuild_test_sb_io 00:20:32.569 ************************************ 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true true true 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local 
superblock=true 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local background_io=true 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local verify=true 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # local strip_size 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@582 -- # local create_arg 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local data_offset 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@594 -- # strip_size=0 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # raid_pid=1760839 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # waitforlisten 1760839 /var/tmp/spdk-raid.sock 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@827 -- # '[' -z 1760839 ']' 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:32.569 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:20:32.569 11:56:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:32.569 [2024-05-14 11:56:59.624676] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:20:32.569 [2024-05-14 11:56:59.624727] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1760839 ] 00:20:32.569 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:32.569 Zero copy mechanism will not be used. 00:20:32.829 [2024-05-14 11:56:59.744815] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:32.829 [2024-05-14 11:56:59.842046] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:20:32.829 [2024-05-14 11:56:59.900378] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:32.829 [2024-05-14 11:56:59.900425] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:33.766 11:57:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:20:33.766 11:57:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # return 0 00:20:33.766 11:57:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:20:33.766 11:57:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:33.766 BaseBdev1_malloc 00:20:33.766 11:57:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:34.025 [2024-05-14 11:57:00.964036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:34.025 [2024-05-14 11:57:00.964089] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.025 [2024-05-14 11:57:00.964113] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: 
io_device created at: 0x0xc5b960 00:20:34.025 [2024-05-14 11:57:00.964126] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.025 [2024-05-14 11:57:00.965900] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.025 [2024-05-14 11:57:00.965932] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:34.025 BaseBdev1 00:20:34.025 11:57:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:20:34.025 11:57:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:34.284 BaseBdev2_malloc 00:20:34.285 11:57:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:34.544 [2024-05-14 11:57:01.455074] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:34.544 [2024-05-14 11:57:01.455125] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.544 [2024-05-14 11:57:01.455148] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0eb40 00:20:34.544 [2024-05-14 11:57:01.455162] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.544 [2024-05-14 11:57:01.456782] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.544 [2024-05-14 11:57:01.456811] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:34.544 BaseBdev2 00:20:34.544 11:57:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:20:34.803 spare_malloc 00:20:34.803 11:57:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:35.063 spare_delay 00:20:35.063 11:57:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:35.322 [2024-05-14 11:57:02.190175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:35.322 [2024-05-14 11:57:02.190229] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:35.322 [2024-05-14 11:57:02.190248] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc57700 00:20:35.322 [2024-05-14 11:57:02.190261] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:35.322 [2024-05-14 11:57:02.191901] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:35.322 [2024-05-14 11:57:02.191932] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:35.322 spare 00:20:35.322 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:35.582 [2024-05-14 11:57:02.430844] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:35.582 [2024-05-14 11:57:02.432212] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:35.582 [2024-05-14 11:57:02.432388] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xc56a50 00:20:35.582 [2024-05-14 11:57:02.432419] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, 
blocklen 512 00:20:35.582 [2024-05-14 11:57:02.432632] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc542d0 00:20:35.582 [2024-05-14 11:57:02.432784] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc56a50 00:20:35.582 [2024-05-14 11:57:02.432794] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc56a50 00:20:35.582 [2024-05-14 11:57:02.432904] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:35.582 
11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:35.582 "name": "raid_bdev1", 00:20:35.582 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:35.582 "strip_size_kb": 0, 00:20:35.582 "state": "online", 00:20:35.582 "raid_level": "raid1", 00:20:35.582 "superblock": true, 00:20:35.582 "num_base_bdevs": 2, 00:20:35.582 "num_base_bdevs_discovered": 2, 00:20:35.582 "num_base_bdevs_operational": 2, 00:20:35.582 "base_bdevs_list": [ 00:20:35.582 { 00:20:35.582 "name": "BaseBdev1", 00:20:35.582 "uuid": "13d71d81-c4f9-5f81-b1bf-1dfe96975f4d", 00:20:35.582 "is_configured": true, 00:20:35.582 "data_offset": 2048, 00:20:35.582 "data_size": 63488 00:20:35.582 }, 00:20:35.582 { 00:20:35.582 "name": "BaseBdev2", 00:20:35.582 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:35.582 "is_configured": true, 00:20:35.582 "data_offset": 2048, 00:20:35.582 "data_size": 63488 00:20:35.582 } 00:20:35.582 ] 00:20:35.582 }' 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:35.582 11:57:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:36.151 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:20:36.151 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:36.410 [2024-05-14 11:57:03.425662] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:36.410 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=63488 00:20:36.410 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.410 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # jq 
-r '.[].base_bdevs_list[0].data_offset' 00:20:36.669 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # data_offset=2048 00:20:36.669 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@626 -- # '[' true = true ']' 00:20:36.669 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:36.669 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:36.669 [2024-05-14 11:57:03.712265] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc5a5e0 00:20:36.669 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:36.669 Zero copy mechanism will not be used. 00:20:36.669 Running I/O for 60 seconds... 00:20:36.928 [2024-05-14 11:57:03.817377] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:36.928 [2024-05-14 11:57:03.825569] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc5a5e0 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local 
raid_bdev_info 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.928 11:57:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:37.187 11:57:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:37.187 "name": "raid_bdev1", 00:20:37.187 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:37.187 "strip_size_kb": 0, 00:20:37.187 "state": "online", 00:20:37.187 "raid_level": "raid1", 00:20:37.187 "superblock": true, 00:20:37.187 "num_base_bdevs": 2, 00:20:37.187 "num_base_bdevs_discovered": 1, 00:20:37.187 "num_base_bdevs_operational": 1, 00:20:37.187 "base_bdevs_list": [ 00:20:37.187 { 00:20:37.187 "name": null, 00:20:37.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.187 "is_configured": false, 00:20:37.187 "data_offset": 2048, 00:20:37.187 "data_size": 63488 00:20:37.187 }, 00:20:37.187 { 00:20:37.187 "name": "BaseBdev2", 00:20:37.187 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:37.187 "is_configured": true, 00:20:37.187 "data_offset": 2048, 00:20:37.187 "data_size": 63488 00:20:37.187 } 00:20:37.187 ] 00:20:37.187 }' 00:20:37.187 11:57:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:37.187 11:57:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:37.815 11:57:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:37.815 [2024-05-14 11:57:04.851631] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:37.815 11:57:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # sleep 1 00:20:38.074 [2024-05-14 11:57:04.902312] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x963090 00:20:38.074 [2024-05-14 11:57:04.904892] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:38.074 [2024-05-14 11:57:05.014641] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:38.074 [2024-05-14 11:57:05.015101] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:38.074 [2024-05-14 11:57:05.158870] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:38.641 [2024-05-14 11:57:05.499926] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:38.899 11:57:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:38.899 11:57:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:38.899 11:57:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:38.899 11:57:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:38.899 11:57:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:38.899 11:57:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:38.899 11:57:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.157 [2024-05-14 11:57:05.987078] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:39.157 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:39.157 "name": "raid_bdev1", 00:20:39.157 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:39.157 "strip_size_kb": 0, 00:20:39.157 "state": "online", 00:20:39.157 "raid_level": "raid1", 00:20:39.157 "superblock": true, 00:20:39.157 "num_base_bdevs": 2, 00:20:39.157 "num_base_bdevs_discovered": 2, 00:20:39.157 "num_base_bdevs_operational": 2, 00:20:39.157 "process": { 00:20:39.157 "type": "rebuild", 00:20:39.157 "target": "spare", 00:20:39.157 "progress": { 00:20:39.158 "blocks": 14336, 00:20:39.158 "percent": 22 00:20:39.158 } 00:20:39.158 }, 00:20:39.158 "base_bdevs_list": [ 00:20:39.158 { 00:20:39.158 "name": "spare", 00:20:39.158 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:39.158 "is_configured": true, 00:20:39.158 "data_offset": 2048, 00:20:39.158 "data_size": 63488 00:20:39.158 }, 00:20:39.158 { 00:20:39.158 "name": "BaseBdev2", 00:20:39.158 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:39.158 "is_configured": true, 00:20:39.158 "data_offset": 2048, 00:20:39.158 "data_size": 63488 00:20:39.158 } 00:20:39.158 ] 00:20:39.158 }' 00:20:39.158 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:39.158 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:39.158 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:39.158 [2024-05-14 11:57:06.216020] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:39.158 
[2024-05-14 11:57:06.216282] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:39.158 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:39.158 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:39.416 [2024-05-14 11:57:06.464618] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:39.675 [2024-05-14 11:57:06.582655] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:39.675 [2024-05-14 11:57:06.582986] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:39.675 [2024-05-14 11:57:06.583797] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:39.675 [2024-05-14 11:57:06.585440] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:39.675 [2024-05-14 11:57:06.608377] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc5a5e0 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=1 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.675 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.933 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:39.933 "name": "raid_bdev1", 00:20:39.933 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:39.933 "strip_size_kb": 0, 00:20:39.933 "state": "online", 00:20:39.933 "raid_level": "raid1", 00:20:39.933 "superblock": true, 00:20:39.933 "num_base_bdevs": 2, 00:20:39.933 "num_base_bdevs_discovered": 1, 00:20:39.933 "num_base_bdevs_operational": 1, 00:20:39.933 "base_bdevs_list": [ 00:20:39.933 { 00:20:39.933 "name": null, 00:20:39.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.933 "is_configured": false, 00:20:39.933 "data_offset": 2048, 00:20:39.933 "data_size": 63488 00:20:39.933 }, 00:20:39.933 { 00:20:39.933 "name": "BaseBdev2", 00:20:39.933 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:39.933 "is_configured": true, 00:20:39.933 "data_offset": 2048, 00:20:39.933 "data_size": 63488 00:20:39.933 } 00:20:39.933 ] 00:20:39.933 }' 00:20:39.933 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:39.933 11:57:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:40.499 11:57:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:40.499 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:40.499 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:40.499 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:40.499 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:40.499 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:40.499 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.757 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:40.757 "name": "raid_bdev1", 00:20:40.757 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:40.757 "strip_size_kb": 0, 00:20:40.757 "state": "online", 00:20:40.757 "raid_level": "raid1", 00:20:40.757 "superblock": true, 00:20:40.757 "num_base_bdevs": 2, 00:20:40.757 "num_base_bdevs_discovered": 1, 00:20:40.757 "num_base_bdevs_operational": 1, 00:20:40.757 "base_bdevs_list": [ 00:20:40.757 { 00:20:40.757 "name": null, 00:20:40.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.757 "is_configured": false, 00:20:40.757 "data_offset": 2048, 00:20:40.757 "data_size": 63488 00:20:40.757 }, 00:20:40.757 { 00:20:40.757 "name": "BaseBdev2", 00:20:40.757 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:40.757 "is_configured": true, 00:20:40.757 "data_offset": 2048, 00:20:40.757 "data_size": 63488 00:20:40.757 } 00:20:40.757 ] 00:20:40.757 }' 00:20:40.757 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:40.757 11:57:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:40.757 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:41.015 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:41.015 11:57:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:41.015 [2024-05-14 11:57:08.074062] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:41.273 11:57:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # sleep 1 00:20:41.273 [2024-05-14 11:57:08.141612] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc532f0 00:20:41.273 [2024-05-14 11:57:08.143115] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:41.273 [2024-05-14 11:57:08.261844] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:41.273 [2024-05-14 11:57:08.262353] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:41.530 [2024-05-14 11:57:08.398523] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:41.530 [2024-05-14 11:57:08.398708] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:41.788 [2024-05-14 11:57:08.737878] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:42.046 [2024-05-14 11:57:08.958236] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:42.046 [2024-05-14 
11:57:08.958493] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:42.047 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:42.047 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:42.047 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:42.047 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:42.047 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:42.047 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.047 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.304 [2024-05-14 11:57:09.318102] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:42.304 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:42.304 "name": "raid_bdev1", 00:20:42.304 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:42.304 "strip_size_kb": 0, 00:20:42.304 "state": "online", 00:20:42.304 "raid_level": "raid1", 00:20:42.304 "superblock": true, 00:20:42.304 "num_base_bdevs": 2, 00:20:42.304 "num_base_bdevs_discovered": 2, 00:20:42.304 "num_base_bdevs_operational": 2, 00:20:42.304 "process": { 00:20:42.304 "type": "rebuild", 00:20:42.304 "target": "spare", 00:20:42.304 "progress": { 00:20:42.304 "blocks": 14336, 00:20:42.304 "percent": 22 00:20:42.304 } 00:20:42.304 }, 00:20:42.304 "base_bdevs_list": [ 00:20:42.304 { 00:20:42.304 "name": "spare", 00:20:42.304 "uuid": 
"125ccfe8-f724-524d-9403-f45c438b5099", 00:20:42.304 "is_configured": true, 00:20:42.304 "data_offset": 2048, 00:20:42.304 "data_size": 63488 00:20:42.304 }, 00:20:42.304 { 00:20:42.304 "name": "BaseBdev2", 00:20:42.304 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:42.304 "is_configured": true, 00:20:42.304 "data_offset": 2048, 00:20:42.304 "data_size": 63488 00:20:42.304 } 00:20:42.304 ] 00:20:42.304 }' 00:20:42.304 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:20:42.563 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@711 -- # local timeout=685 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
raid_bdev_name=raid_bdev1 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.563 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.821 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:42.821 "name": "raid_bdev1", 00:20:42.821 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:42.821 "strip_size_kb": 0, 00:20:42.821 "state": "online", 00:20:42.821 "raid_level": "raid1", 00:20:42.821 "superblock": true, 00:20:42.821 "num_base_bdevs": 2, 00:20:42.821 "num_base_bdevs_discovered": 2, 00:20:42.821 "num_base_bdevs_operational": 2, 00:20:42.821 "process": { 00:20:42.821 "type": "rebuild", 00:20:42.821 "target": "spare", 00:20:42.821 "progress": { 00:20:42.821 "blocks": 16384, 00:20:42.821 "percent": 25 00:20:42.821 } 00:20:42.821 }, 00:20:42.821 "base_bdevs_list": [ 00:20:42.821 { 00:20:42.821 "name": "spare", 00:20:42.821 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:42.821 "is_configured": true, 00:20:42.821 "data_offset": 2048, 00:20:42.821 "data_size": 63488 00:20:42.821 }, 00:20:42.821 { 00:20:42.821 "name": "BaseBdev2", 00:20:42.821 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:42.821 "is_configured": true, 00:20:42.821 "data_offset": 2048, 00:20:42.821 "data_size": 63488 00:20:42.821 } 00:20:42.821 ] 00:20:42.821 }' 00:20:42.821 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:42.821 
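Note the shell error captured a little earlier in this run: `bdev_raid.sh: line 671: [: =: unary operator expected` comes from the traced test `'[' = false ']'`, i.e. a `[` comparison whose left operand expanded to the empty string because the variable was unquoted. The script keeps running because `[` merely returns a nonzero status. A minimal reproduction of the pitfall (the `flag` variable name here is hypothetical, not the script's actual variable):

```shell
flag=""

# Unquoted: the empty $flag vanishes during word splitting, so `[`
# receives only the words `= false` and cannot parse them (bash reports
# "unary operator expected", as seen in the log); exit status is 2.
[ $flag = false ] 2>/dev/null
unquoted_status=$?

# Quoted: the empty string survives as its own argument, so this is a
# well-formed comparison that is simply false; exit status is 1.
[ "$flag" = false ]
quoted_status=$?

echo "unquoted=$unquoted_status quoted=$quoted_status"
```

Quoting the expansion (`[ "$var" = false ]`) or using bash's `[[ ]]` keyword, which does not word-split, would avoid the spurious error in the test script.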
11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:42.821 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:42.821 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:42.821 11:57:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:42.821 [2024-05-14 11:57:09.820871] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:43.387 [2024-05-14 11:57:10.301755] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:20:43.646 [2024-05-14 11:57:10.520053] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:43.646 [2024-05-14 11:57:10.520265] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:43.904 11:57:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:43.904 11:57:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:43.904 11:57:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:43.904 11:57:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:43.904 11:57:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:43.904 11:57:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:43.904 11:57:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.904 11:57:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.162 11:57:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:44.162 "name": "raid_bdev1", 00:20:44.162 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:44.162 "strip_size_kb": 0, 00:20:44.162 "state": "online", 00:20:44.162 "raid_level": "raid1", 00:20:44.162 "superblock": true, 00:20:44.162 "num_base_bdevs": 2, 00:20:44.162 "num_base_bdevs_discovered": 2, 00:20:44.162 "num_base_bdevs_operational": 2, 00:20:44.162 "process": { 00:20:44.162 "type": "rebuild", 00:20:44.162 "target": "spare", 00:20:44.162 "progress": { 00:20:44.162 "blocks": 32768, 00:20:44.162 "percent": 51 00:20:44.162 } 00:20:44.162 }, 00:20:44.162 "base_bdevs_list": [ 00:20:44.162 { 00:20:44.162 "name": "spare", 00:20:44.162 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:44.162 "is_configured": true, 00:20:44.162 "data_offset": 2048, 00:20:44.162 "data_size": 63488 00:20:44.162 }, 00:20:44.162 { 00:20:44.162 "name": "BaseBdev2", 00:20:44.162 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:44.162 "is_configured": true, 00:20:44.162 "data_offset": 2048, 00:20:44.163 "data_size": 63488 00:20:44.163 } 00:20:44.163 ] 00:20:44.163 }' 00:20:44.163 11:57:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:44.163 11:57:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:44.163 11:57:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:44.163 11:57:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:44.163 11:57:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:44.163 [2024-05-14 11:57:11.233134] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 
offset_end: 43008 00:20:44.729 [2024-05-14 11:57:11.692012] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:45.295 "name": "raid_bdev1", 00:20:45.295 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:45.295 "strip_size_kb": 0, 00:20:45.295 "state": "online", 00:20:45.295 "raid_level": "raid1", 00:20:45.295 "superblock": true, 00:20:45.295 "num_base_bdevs": 2, 00:20:45.295 "num_base_bdevs_discovered": 2, 00:20:45.295 "num_base_bdevs_operational": 2, 00:20:45.295 "process": { 00:20:45.295 "type": "rebuild", 00:20:45.295 "target": "spare", 00:20:45.295 "progress": { 00:20:45.295 "blocks": 53248, 00:20:45.295 "percent": 83 00:20:45.295 } 00:20:45.295 }, 00:20:45.295 "base_bdevs_list": [ 00:20:45.295 { 00:20:45.295 "name": "spare", 00:20:45.295 "uuid": 
"125ccfe8-f724-524d-9403-f45c438b5099", 00:20:45.295 "is_configured": true, 00:20:45.295 "data_offset": 2048, 00:20:45.295 "data_size": 63488 00:20:45.295 }, 00:20:45.295 { 00:20:45.295 "name": "BaseBdev2", 00:20:45.295 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:45.295 "is_configured": true, 00:20:45.295 "data_offset": 2048, 00:20:45.295 "data_size": 63488 00:20:45.295 } 00:20:45.295 ] 00:20:45.295 }' 00:20:45.295 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:45.552 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:45.552 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:45.552 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:45.552 11:57:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:20:45.552 [2024-05-14 11:57:12.479945] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:20:45.809 [2024-05-14 11:57:12.819705] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:46.066 [2024-05-14 11:57:12.920006] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:46.066 [2024-05-14 11:57:12.921786] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:46.631 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:20:46.631 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:46.631 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:46.631 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local 
process_type=rebuild 00:20:46.631 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:46.631 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:46.631 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.631 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:46.888 "name": "raid_bdev1", 00:20:46.888 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:46.888 "strip_size_kb": 0, 00:20:46.888 "state": "online", 00:20:46.888 "raid_level": "raid1", 00:20:46.888 "superblock": true, 00:20:46.888 "num_base_bdevs": 2, 00:20:46.888 "num_base_bdevs_discovered": 2, 00:20:46.888 "num_base_bdevs_operational": 2, 00:20:46.888 "base_bdevs_list": [ 00:20:46.888 { 00:20:46.888 "name": "spare", 00:20:46.888 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:46.888 "is_configured": true, 00:20:46.888 "data_offset": 2048, 00:20:46.888 "data_size": 63488 00:20:46.888 }, 00:20:46.888 { 00:20:46.888 "name": "BaseBdev2", 00:20:46.888 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:46.888 "is_configured": true, 00:20:46.888 "data_offset": 2048, 00:20:46.888 "data_size": 63488 00:20:46.888 } 00:20:46.888 ] 00:20:46.888 }' 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e 
]] 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # break 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.888 11:57:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:47.146 "name": "raid_bdev1", 00:20:47.146 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:47.146 "strip_size_kb": 0, 00:20:47.146 "state": "online", 00:20:47.146 "raid_level": "raid1", 00:20:47.146 "superblock": true, 00:20:47.146 "num_base_bdevs": 2, 00:20:47.146 "num_base_bdevs_discovered": 2, 00:20:47.146 "num_base_bdevs_operational": 2, 00:20:47.146 "base_bdevs_list": [ 00:20:47.146 { 00:20:47.146 "name": "spare", 00:20:47.146 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:47.146 "is_configured": true, 00:20:47.146 "data_offset": 2048, 00:20:47.146 "data_size": 63488 00:20:47.146 }, 00:20:47.146 { 00:20:47.146 "name": "BaseBdev2", 00:20:47.146 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:47.146 "is_configured": true, 00:20:47.146 "data_offset": 2048, 00:20:47.146 "data_size": 63488 00:20:47.146 } 00:20:47.146 ] 00:20:47.146 }' 00:20:47.146 11:57:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.146 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.405 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 
00:20:47.405 "name": "raid_bdev1", 00:20:47.405 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:47.405 "strip_size_kb": 0, 00:20:47.405 "state": "online", 00:20:47.405 "raid_level": "raid1", 00:20:47.405 "superblock": true, 00:20:47.405 "num_base_bdevs": 2, 00:20:47.405 "num_base_bdevs_discovered": 2, 00:20:47.405 "num_base_bdevs_operational": 2, 00:20:47.405 "base_bdevs_list": [ 00:20:47.405 { 00:20:47.405 "name": "spare", 00:20:47.405 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:47.405 "is_configured": true, 00:20:47.405 "data_offset": 2048, 00:20:47.405 "data_size": 63488 00:20:47.405 }, 00:20:47.405 { 00:20:47.405 "name": "BaseBdev2", 00:20:47.405 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:47.405 "is_configured": true, 00:20:47.405 "data_offset": 2048, 00:20:47.405 "data_size": 63488 00:20:47.405 } 00:20:47.405 ] 00:20:47.405 }' 00:20:47.405 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:47.405 11:57:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:47.972 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:48.230 [2024-05-14 11:57:15.249579] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:48.230 [2024-05-14 11:57:15.249615] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:48.487 00:20:48.487 Latency(us) 00:20:48.487 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:48.487 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:20:48.487 raid_bdev1 : 11.60 89.15 267.44 0.00 0.00 15603.39 293.84 118534.68 00:20:48.487 =================================================================================================================== 00:20:48.487 Total : 89.15 
267.44 0.00 0.00 15603.39 293.84 118534.68 00:20:48.487 [2024-05-14 11:57:15.345775] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:48.487 [2024-05-14 11:57:15.345806] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:48.487 [2024-05-14 11:57:15.345881] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:48.487 [2024-05-14 11:57:15.345893] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc56a50 name raid_bdev1, state offline 00:20:48.488 0 00:20:48.488 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.488 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # jq length 00:20:48.745 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:20:48.745 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:20:48.745 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@728 -- # '[' true = true ']' 00:20:48.745 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:20:48.746 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:48.746 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:20:48.746 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:48.746 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:48.746 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:48.746 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:20:48.746 11:57:15 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:48.746 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:48.746 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:20:49.004 /dev/nbd0 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:49.004 1+0 records in 00:20:49.004 1+0 records out 00:20:49.004 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000203017 s, 20.2 MB/s 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev2 ']' 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:49.004 11:57:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:20:49.004 /dev/nbd1 00:20:49.004 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:49.262 1+0 records in 00:20:49.262 1+0 records out 00:20:49.262 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266432 s, 15.4 MB/s 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@736 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:49.262 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 
00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:49.520 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # 
return 0 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:20:49.779 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:50.037 11:57:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:50.295 [2024-05-14 11:57:17.197028] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:50.295 [2024-05-14 11:57:17.197080] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:50.295 [2024-05-14 11:57:17.197099] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc53090 00:20:50.295 [2024-05-14 11:57:17.197112] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:50.295 [2024-05-14 11:57:17.198722] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:50.295 [2024-05-14 11:57:17.198752] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:50.295 [2024-05-14 11:57:17.198820] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:50.295 [2024-05-14 11:57:17.198849] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:50.295 BaseBdev1 00:20:50.295 11:57:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:20:50.295 11:57:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # 
'[' -z BaseBdev2 ']' 00:20:50.295 11:57:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:20:50.553 11:57:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:50.811 [2024-05-14 11:57:17.690384] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:50.811 [2024-05-14 11:57:17.690429] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:50.811 [2024-05-14 11:57:17.690448] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc6c510 00:20:50.811 [2024-05-14 11:57:17.690460] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:50.811 [2024-05-14 11:57:17.690784] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:50.811 [2024-05-14 11:57:17.690801] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:50.811 [2024-05-14 11:57:17.690863] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev2 00:20:50.811 [2024-05-14 11:57:17.690874] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing raid bdev raid_bdev1 (1) 00:20:50.811 [2024-05-14 11:57:17.690885] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:50.811 [2024-05-14 11:57:17.690900] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc56180 name raid_bdev1, state configuring 00:20:50.811 [2024-05-14 11:57:17.690931] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:50.811 BaseBdev2 00:20:50.811 11:57:17 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:51.070 11:57:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:51.327 [2024-05-14 11:57:18.175927] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:51.327 [2024-05-14 11:57:18.175966] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:51.327 [2024-05-14 11:57:18.175988] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc566c0 00:20:51.327 [2024-05-14 11:57:18.176000] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:51.327 [2024-05-14 11:57:18.176357] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:51.327 [2024-05-14 11:57:18.176374] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:51.327 [2024-05-14 11:57:18.176457] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:20:51.327 [2024-05-14 11:57:18.176484] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:51.327 spare 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:51.327 11:57:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.327 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:51.327 [2024-05-14 11:57:18.276812] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xc59810 00:20:51.327 [2024-05-14 11:57:18.276828] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:51.328 [2024-05-14 11:57:18.277005] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc589b0 00:20:51.328 [2024-05-14 11:57:18.277151] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc59810 00:20:51.328 [2024-05-14 11:57:18.277161] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc59810 00:20:51.328 [2024-05-14 11:57:18.277268] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:51.585 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:51.585 "name": "raid_bdev1", 00:20:51.585 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:51.585 "strip_size_kb": 0, 00:20:51.585 "state": "online", 00:20:51.585 "raid_level": "raid1", 00:20:51.585 "superblock": true, 00:20:51.585 "num_base_bdevs": 2, 00:20:51.585 
"num_base_bdevs_discovered": 2, 00:20:51.585 "num_base_bdevs_operational": 2, 00:20:51.585 "base_bdevs_list": [ 00:20:51.585 { 00:20:51.585 "name": "spare", 00:20:51.585 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:51.585 "is_configured": true, 00:20:51.585 "data_offset": 2048, 00:20:51.585 "data_size": 63488 00:20:51.585 }, 00:20:51.585 { 00:20:51.585 "name": "BaseBdev2", 00:20:51.585 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:51.585 "is_configured": true, 00:20:51.585 "data_offset": 2048, 00:20:51.585 "data_size": 63488 00:20:51.585 } 00:20:51.585 ] 00:20:51.585 }' 00:20:51.585 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:51.585 11:57:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:52.153 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:52.153 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:52.153 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:52.153 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:52.153 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:52.153 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.153 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:52.411 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:52.411 "name": "raid_bdev1", 00:20:52.411 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:52.411 "strip_size_kb": 0, 00:20:52.411 "state": "online", 00:20:52.411 "raid_level": "raid1", 
00:20:52.411 "superblock": true, 00:20:52.411 "num_base_bdevs": 2, 00:20:52.411 "num_base_bdevs_discovered": 2, 00:20:52.411 "num_base_bdevs_operational": 2, 00:20:52.411 "base_bdevs_list": [ 00:20:52.411 { 00:20:52.411 "name": "spare", 00:20:52.411 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:52.411 "is_configured": true, 00:20:52.411 "data_offset": 2048, 00:20:52.411 "data_size": 63488 00:20:52.411 }, 00:20:52.411 { 00:20:52.411 "name": "BaseBdev2", 00:20:52.411 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:52.411 "is_configured": true, 00:20:52.411 "data_offset": 2048, 00:20:52.411 "data_size": 63488 00:20:52.411 } 00:20:52.411 ] 00:20:52.411 }' 00:20:52.411 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:52.411 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:52.411 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:52.411 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:52.411 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:52.411 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.670 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:20:52.670 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:52.931 [2024-05-14 11:57:19.848711] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.931 11:57:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.190 11:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:53.190 "name": "raid_bdev1", 00:20:53.190 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:53.190 "strip_size_kb": 0, 00:20:53.190 "state": "online", 00:20:53.190 "raid_level": "raid1", 00:20:53.190 "superblock": true, 00:20:53.190 "num_base_bdevs": 2, 00:20:53.190 "num_base_bdevs_discovered": 1, 00:20:53.190 "num_base_bdevs_operational": 1, 00:20:53.190 "base_bdevs_list": [ 00:20:53.190 { 00:20:53.190 "name": null, 00:20:53.190 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.190 "is_configured": false, 00:20:53.190 
"data_offset": 2048, 00:20:53.190 "data_size": 63488 00:20:53.190 }, 00:20:53.190 { 00:20:53.190 "name": "BaseBdev2", 00:20:53.190 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:53.190 "is_configured": true, 00:20:53.190 "data_offset": 2048, 00:20:53.190 "data_size": 63488 00:20:53.190 } 00:20:53.190 ] 00:20:53.190 }' 00:20:53.190 11:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:53.190 11:57:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:53.757 11:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:54.016 [2024-05-14 11:57:20.931743] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:54.016 [2024-05-14 11:57:20.931906] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:54.016 [2024-05-14 11:57:20.931928] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:54.016 [2024-05-14 11:57:20.931958] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:54.016 [2024-05-14 11:57:20.937238] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc6e1d0 00:20:54.016 [2024-05-14 11:57:20.939339] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:54.016 11:57:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # sleep 1 00:20:55.015 11:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:55.015 11:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:55.015 11:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:55.015 11:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:55.015 11:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:55.015 11:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.015 11:57:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.274 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:55.274 "name": "raid_bdev1", 00:20:55.274 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:55.274 "strip_size_kb": 0, 00:20:55.274 "state": "online", 00:20:55.274 "raid_level": "raid1", 00:20:55.274 "superblock": true, 00:20:55.274 "num_base_bdevs": 2, 00:20:55.274 "num_base_bdevs_discovered": 2, 00:20:55.274 "num_base_bdevs_operational": 2, 00:20:55.274 "process": { 00:20:55.274 "type": "rebuild", 00:20:55.274 "target": "spare", 00:20:55.274 "progress": { 00:20:55.274 "blocks": 24576, 
00:20:55.274 "percent": 38 00:20:55.274 } 00:20:55.274 }, 00:20:55.274 "base_bdevs_list": [ 00:20:55.274 { 00:20:55.274 "name": "spare", 00:20:55.274 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:55.274 "is_configured": true, 00:20:55.274 "data_offset": 2048, 00:20:55.274 "data_size": 63488 00:20:55.274 }, 00:20:55.274 { 00:20:55.274 "name": "BaseBdev2", 00:20:55.274 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:55.274 "is_configured": true, 00:20:55.274 "data_offset": 2048, 00:20:55.274 "data_size": 63488 00:20:55.274 } 00:20:55.274 ] 00:20:55.274 }' 00:20:55.274 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:55.274 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:55.274 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:55.274 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:55.274 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:55.532 [2024-05-14 11:57:22.526248] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:55.532 [2024-05-14 11:57:22.552203] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:55.532 [2024-05-14 11:57:22.552247] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:55.532 
11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.532 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.790 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:55.790 "name": "raid_bdev1", 00:20:55.790 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:55.790 "strip_size_kb": 0, 00:20:55.790 "state": "online", 00:20:55.790 "raid_level": "raid1", 00:20:55.790 "superblock": true, 00:20:55.790 "num_base_bdevs": 2, 00:20:55.790 "num_base_bdevs_discovered": 1, 00:20:55.790 "num_base_bdevs_operational": 1, 00:20:55.790 "base_bdevs_list": [ 00:20:55.790 { 00:20:55.790 "name": null, 00:20:55.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.790 "is_configured": false, 00:20:55.790 "data_offset": 2048, 00:20:55.790 "data_size": 63488 00:20:55.790 }, 00:20:55.790 { 00:20:55.790 "name": "BaseBdev2", 00:20:55.790 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:55.790 "is_configured": true, 00:20:55.790 "data_offset": 2048, 00:20:55.790 
"data_size": 63488 00:20:55.790 } 00:20:55.790 ] 00:20:55.790 }' 00:20:55.790 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:55.790 11:57:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:56.357 11:57:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:56.616 [2024-05-14 11:57:23.652659] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:56.616 [2024-05-14 11:57:23.652713] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:56.616 [2024-05-14 11:57:23.652735] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc6d590 00:20:56.616 [2024-05-14 11:57:23.652748] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:56.616 [2024-05-14 11:57:23.653133] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:56.616 [2024-05-14 11:57:23.653151] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:56.616 [2024-05-14 11:57:23.653237] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:20:56.616 [2024-05-14 11:57:23.653250] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:56.616 [2024-05-14 11:57:23.653260] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:56.616 [2024-05-14 11:57:23.653279] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:56.616 [2024-05-14 11:57:23.658586] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x963090 00:20:56.616 spare 00:20:56.616 [2024-05-14 11:57:23.660049] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:56.616 11:57:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # sleep 1 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:57.993 "name": "raid_bdev1", 00:20:57.993 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:57.993 "strip_size_kb": 0, 00:20:57.993 "state": "online", 00:20:57.993 "raid_level": "raid1", 00:20:57.993 "superblock": true, 00:20:57.993 "num_base_bdevs": 2, 00:20:57.993 "num_base_bdevs_discovered": 2, 00:20:57.993 "num_base_bdevs_operational": 2, 00:20:57.993 "process": { 00:20:57.993 "type": "rebuild", 00:20:57.993 "target": "spare", 00:20:57.993 "progress": { 00:20:57.993 
"blocks": 24576, 00:20:57.993 "percent": 38 00:20:57.993 } 00:20:57.993 }, 00:20:57.993 "base_bdevs_list": [ 00:20:57.993 { 00:20:57.993 "name": "spare", 00:20:57.993 "uuid": "125ccfe8-f724-524d-9403-f45c438b5099", 00:20:57.993 "is_configured": true, 00:20:57.993 "data_offset": 2048, 00:20:57.993 "data_size": 63488 00:20:57.993 }, 00:20:57.993 { 00:20:57.993 "name": "BaseBdev2", 00:20:57.993 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:57.993 "is_configured": true, 00:20:57.993 "data_offset": 2048, 00:20:57.993 "data_size": 63488 00:20:57.993 } 00:20:57.993 ] 00:20:57.993 }' 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:57.993 11:57:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:57.993 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:57.993 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:58.253 [2024-05-14 11:57:25.236394] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:58.253 [2024-05-14 11:57:25.272890] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:58.253 [2024-05-14 11:57:25.272936] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local 
expected_state=online 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.253 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.512 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:58.512 "name": "raid_bdev1", 00:20:58.512 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:58.512 "strip_size_kb": 0, 00:20:58.512 "state": "online", 00:20:58.512 "raid_level": "raid1", 00:20:58.512 "superblock": true, 00:20:58.512 "num_base_bdevs": 2, 00:20:58.512 "num_base_bdevs_discovered": 1, 00:20:58.512 "num_base_bdevs_operational": 1, 00:20:58.512 "base_bdevs_list": [ 00:20:58.512 { 00:20:58.512 "name": null, 00:20:58.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.512 "is_configured": false, 00:20:58.512 "data_offset": 2048, 00:20:58.512 "data_size": 63488 00:20:58.512 }, 00:20:58.512 { 00:20:58.512 "name": "BaseBdev2", 00:20:58.512 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:58.512 "is_configured": true, 00:20:58.512 
"data_offset": 2048, 00:20:58.512 "data_size": 63488 00:20:58.512 } 00:20:58.512 ] 00:20:58.512 }' 00:20:58.512 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:58.512 11:57:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:59.080 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:59.080 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:59.080 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:59.080 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:59.080 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:59.080 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.080 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.339 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:59.339 "name": "raid_bdev1", 00:20:59.339 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:20:59.339 "strip_size_kb": 0, 00:20:59.339 "state": "online", 00:20:59.339 "raid_level": "raid1", 00:20:59.339 "superblock": true, 00:20:59.339 "num_base_bdevs": 2, 00:20:59.339 "num_base_bdevs_discovered": 1, 00:20:59.339 "num_base_bdevs_operational": 1, 00:20:59.339 "base_bdevs_list": [ 00:20:59.339 { 00:20:59.339 "name": null, 00:20:59.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.339 "is_configured": false, 00:20:59.339 "data_offset": 2048, 00:20:59.339 "data_size": 63488 00:20:59.339 }, 00:20:59.339 { 00:20:59.339 "name": "BaseBdev2", 00:20:59.339 "uuid": 
"1430add4-33ad-55c1-bd36-d45ce655a567", 00:20:59.339 "is_configured": true, 00:20:59.339 "data_offset": 2048, 00:20:59.339 "data_size": 63488 00:20:59.339 } 00:20:59.339 ] 00:20:59.339 }' 00:20:59.339 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:59.598 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:59.598 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:59.598 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:59.598 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:59.856 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:59.856 [2024-05-14 11:57:26.942195] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:59.856 [2024-05-14 11:57:26.942252] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:59.856 [2024-05-14 11:57:26.942275] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc52290 00:20:59.856 [2024-05-14 11:57:26.942291] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:00.114 [2024-05-14 11:57:26.942695] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:00.114 [2024-05-14 11:57:26.942717] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:00.114 [2024-05-14 11:57:26.942804] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:00.114 [2024-05-14 11:57:26.942819] 
bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:00.114 [2024-05-14 11:57:26.942830] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:00.114 BaseBdev1 00:21:00.114 11:57:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@786 -- # sleep 1 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.046 11:57:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.303 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:01.303 "name": 
"raid_bdev1", 00:21:01.303 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:21:01.303 "strip_size_kb": 0, 00:21:01.303 "state": "online", 00:21:01.303 "raid_level": "raid1", 00:21:01.303 "superblock": true, 00:21:01.303 "num_base_bdevs": 2, 00:21:01.303 "num_base_bdevs_discovered": 1, 00:21:01.303 "num_base_bdevs_operational": 1, 00:21:01.303 "base_bdevs_list": [ 00:21:01.303 { 00:21:01.303 "name": null, 00:21:01.304 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.304 "is_configured": false, 00:21:01.304 "data_offset": 2048, 00:21:01.304 "data_size": 63488 00:21:01.304 }, 00:21:01.304 { 00:21:01.304 "name": "BaseBdev2", 00:21:01.304 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:21:01.304 "is_configured": true, 00:21:01.304 "data_offset": 2048, 00:21:01.304 "data_size": 63488 00:21:01.304 } 00:21:01.304 ] 00:21:01.304 }' 00:21:01.304 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:01.304 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:01.869 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:01.869 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:01.869 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:01.869 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:01.869 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:01.869 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.869 11:57:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.128 11:57:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:02.128 "name": "raid_bdev1", 00:21:02.128 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:21:02.128 "strip_size_kb": 0, 00:21:02.128 "state": "online", 00:21:02.128 "raid_level": "raid1", 00:21:02.128 "superblock": true, 00:21:02.128 "num_base_bdevs": 2, 00:21:02.128 "num_base_bdevs_discovered": 1, 00:21:02.128 "num_base_bdevs_operational": 1, 00:21:02.128 "base_bdevs_list": [ 00:21:02.128 { 00:21:02.128 "name": null, 00:21:02.128 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.128 "is_configured": false, 00:21:02.128 "data_offset": 2048, 00:21:02.128 "data_size": 63488 00:21:02.128 }, 00:21:02.128 { 00:21:02.128 "name": "BaseBdev2", 00:21:02.128 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:21:02.128 "is_configured": true, 00:21:02.128 "data_offset": 2048, 00:21:02.128 "data_size": 63488 00:21:02.128 } 00:21:02.128 ] 00:21:02.128 }' 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:02.128 
11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:02.128 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:02.386 [2024-05-14 11:57:29.356911] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:02.386 [2024-05-14 11:57:29.357046] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:02.386 [2024-05-14 11:57:29.357062] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:02.386 request: 00:21:02.386 { 00:21:02.386 "raid_bdev": "raid_bdev1", 00:21:02.386 "base_bdev": "BaseBdev1", 00:21:02.386 "method": 
"bdev_raid_add_base_bdev", 00:21:02.386 "req_id": 1 00:21:02.386 } 00:21:02.386 Got JSON-RPC error response 00:21:02.386 response: 00:21:02.386 { 00:21:02.386 "code": -22, 00:21:02.386 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:02.386 } 00:21:02.386 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:21:02.386 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:02.386 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:02.386 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:02.386 11:57:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # sleep 1 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.320 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:03.579 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:03.579 "name": "raid_bdev1", 00:21:03.579 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:21:03.579 "strip_size_kb": 0, 00:21:03.579 "state": "online", 00:21:03.579 "raid_level": "raid1", 00:21:03.579 "superblock": true, 00:21:03.579 "num_base_bdevs": 2, 00:21:03.579 "num_base_bdevs_discovered": 1, 00:21:03.579 "num_base_bdevs_operational": 1, 00:21:03.579 "base_bdevs_list": [ 00:21:03.579 { 00:21:03.579 "name": null, 00:21:03.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.579 "is_configured": false, 00:21:03.579 "data_offset": 2048, 00:21:03.579 "data_size": 63488 00:21:03.579 }, 00:21:03.579 { 00:21:03.579 "name": "BaseBdev2", 00:21:03.579 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:21:03.579 "is_configured": true, 00:21:03.579 "data_offset": 2048, 00:21:03.579 "data_size": 63488 00:21:03.579 } 00:21:03.579 ] 00:21:03.579 }' 00:21:03.579 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:03.579 11:57:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:04.146 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:04.146 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:04.146 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:04.146 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:04.146 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 
00:21:04.146 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.146 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:04.404 "name": "raid_bdev1", 00:21:04.404 "uuid": "f26e234b-6b3c-4a30-840b-c9c0acb7bf18", 00:21:04.404 "strip_size_kb": 0, 00:21:04.404 "state": "online", 00:21:04.404 "raid_level": "raid1", 00:21:04.404 "superblock": true, 00:21:04.404 "num_base_bdevs": 2, 00:21:04.404 "num_base_bdevs_discovered": 1, 00:21:04.404 "num_base_bdevs_operational": 1, 00:21:04.404 "base_bdevs_list": [ 00:21:04.404 { 00:21:04.404 "name": null, 00:21:04.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.404 "is_configured": false, 00:21:04.404 "data_offset": 2048, 00:21:04.404 "data_size": 63488 00:21:04.404 }, 00:21:04.404 { 00:21:04.404 "name": "BaseBdev2", 00:21:04.404 "uuid": "1430add4-33ad-55c1-bd36-d45ce655a567", 00:21:04.404 "is_configured": true, 00:21:04.404 "data_offset": 2048, 00:21:04.404 "data_size": 63488 00:21:04.404 } 00:21:04.404 ] 00:21:04.404 }' 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # killprocess 1760839 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@946 -- # '[' -z 1760839 ']' 00:21:04.404 11:57:31 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # kill -0 1760839 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # uname 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1760839 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:21:04.404 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:21:04.662 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1760839' 00:21:04.662 killing process with pid 1760839 00:21:04.662 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@965 -- # kill 1760839 00:21:04.662 Received shutdown signal, test time was about 27.708813 seconds 00:21:04.662 00:21:04.662 Latency(us) 00:21:04.662 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:04.662 =================================================================================================================== 00:21:04.662 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:04.662 [2024-05-14 11:57:31.490887] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:04.662 [2024-05-14 11:57:31.490995] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:04.662 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@970 -- # wait 1760839 00:21:04.662 [2024-05-14 11:57:31.491050] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:04.662 [2024-05-14 11:57:31.491063] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc59810 name raid_bdev1, state offline 00:21:04.662 [2024-05-14 
11:57:31.512037] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:04.662 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@797 -- # return 0 00:21:04.662 00:21:04.662 real 0m32.159s 00:21:04.662 user 0m50.420s 00:21:04.662 sys 0m4.585s 00:21:04.662 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:21:04.662 11:57:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:04.662 ************************************ 00:21:04.662 END TEST raid_rebuild_test_sb_io 00:21:04.662 ************************************ 00:21:04.920 11:57:31 bdev_raid -- bdev/bdev_raid.sh@822 -- # for n in 2 4 00:21:04.920 11:57:31 bdev_raid -- bdev/bdev_raid.sh@823 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:21:04.920 11:57:31 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:21:04.920 11:57:31 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:21:04.920 11:57:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:04.920 ************************************ 00:21:04.920 START TEST raid_rebuild_test 00:21:04.920 ************************************ 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 false false true 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=4 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local superblock=false 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local verify=true 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:21:04.920 11:57:31 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev3 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # echo BaseBdev4 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # local strip_size 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@582 -- # local create_arg 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local data_offset 
00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # '[' false = true ']' 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # raid_pid=1765959 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # waitforlisten 1765959 /var/tmp/spdk-raid.sock 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@827 -- # '[' -z 1765959 ']' 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:04.920 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:21:04.920 11:57:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:04.920 [2024-05-14 11:57:31.879296] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:21:04.920 [2024-05-14 11:57:31.879357] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1765959 ] 00:21:04.920 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:04.920 Zero copy mechanism will not be used. 00:21:05.178 [2024-05-14 11:57:32.006329] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:05.178 [2024-05-14 11:57:32.111623] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:05.178 [2024-05-14 11:57:32.179143] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:05.178 [2024-05-14 11:57:32.179188] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:05.745 11:57:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:21:05.745 11:57:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # return 0 00:21:05.745 11:57:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:05.745 11:57:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:06.003 BaseBdev1_malloc 00:21:06.003 11:57:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:06.261 [2024-05-14 11:57:33.282457] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:06.261 [2024-05-14 11:57:33.282507] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:06.261 [2024-05-14 11:57:33.282529] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1913960 
00:21:06.261 [2024-05-14 11:57:33.282542] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:06.261 [2024-05-14 11:57:33.284323] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:06.261 [2024-05-14 11:57:33.284353] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:06.261 BaseBdev1 00:21:06.261 11:57:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:06.261 11:57:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:06.518 BaseBdev2_malloc 00:21:06.518 11:57:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:06.776 [2024-05-14 11:57:33.792722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:06.776 [2024-05-14 11:57:33.792776] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:06.776 [2024-05-14 11:57:33.792799] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ac6b40 00:21:06.776 [2024-05-14 11:57:33.792811] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:06.776 [2024-05-14 11:57:33.794275] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:06.776 [2024-05-14 11:57:33.794303] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:06.776 BaseBdev2 00:21:06.776 11:57:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:06.776 11:57:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:07.033 BaseBdev3_malloc 00:21:07.033 11:57:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:07.291 [2024-05-14 11:57:34.296028] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:07.291 [2024-05-14 11:57:34.296076] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.291 [2024-05-14 11:57:34.296100] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x190d4c0 00:21:07.291 [2024-05-14 11:57:34.296114] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.291 [2024-05-14 11:57:34.297709] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.291 [2024-05-14 11:57:34.297737] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:07.291 BaseBdev3 00:21:07.291 11:57:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:07.291 11:57:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:07.549 BaseBdev4_malloc 00:21:07.549 11:57:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:07.807 [2024-05-14 11:57:34.787192] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:07.807 [2024-05-14 11:57:34.787242] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.807 [2024-05-14 11:57:34.787262] vbdev_passthru.c: 676:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x190fec0 00:21:07.807 [2024-05-14 11:57:34.787275] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.807 [2024-05-14 11:57:34.788880] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.807 [2024-05-14 11:57:34.788908] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:07.807 BaseBdev4 00:21:07.807 11:57:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:08.065 spare_malloc 00:21:08.066 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:08.325 spare_delay 00:21:08.325 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:08.584 [2024-05-14 11:57:35.442820] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:08.584 [2024-05-14 11:57:35.442867] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:08.584 [2024-05-14 11:57:35.442885] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1910ec0 00:21:08.584 [2024-05-14 11:57:35.442898] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:08.584 [2024-05-14 11:57:35.444510] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:08.584 [2024-05-14 11:57:35.444543] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:08.584 spare 00:21:08.584 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:08.843 [2024-05-14 11:57:35.679478] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:08.843 [2024-05-14 11:57:35.680802] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:08.843 [2024-05-14 11:57:35.680858] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:08.843 [2024-05-14 11:57:35.680904] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:08.843 [2024-05-14 11:57:35.680985] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x19129a0 00:21:08.843 [2024-05-14 11:57:35.680995] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:08.843 [2024-05-14 11:57:35.681204] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x190f990 00:21:08.843 [2024-05-14 11:57:35.681356] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19129a0 00:21:08.843 [2024-05-14 11:57:35.681367] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19129a0 00:21:08.843 [2024-05-14 11:57:35.681495] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:08.843 
11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.843 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:09.101 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:09.101 "name": "raid_bdev1", 00:21:09.101 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:09.101 "strip_size_kb": 0, 00:21:09.101 "state": "online", 00:21:09.101 "raid_level": "raid1", 00:21:09.101 "superblock": false, 00:21:09.101 "num_base_bdevs": 4, 00:21:09.101 "num_base_bdevs_discovered": 4, 00:21:09.101 "num_base_bdevs_operational": 4, 00:21:09.101 "base_bdevs_list": [ 00:21:09.101 { 00:21:09.101 "name": "BaseBdev1", 00:21:09.101 "uuid": "e55e7873-2f9f-5f1d-b84c-bfa3c1ed1a1e", 00:21:09.101 "is_configured": true, 00:21:09.101 "data_offset": 0, 00:21:09.101 "data_size": 65536 00:21:09.101 }, 00:21:09.101 { 00:21:09.101 "name": "BaseBdev2", 00:21:09.101 "uuid": "7fb15362-33ad-51c7-9796-16dcca847902", 00:21:09.101 "is_configured": true, 00:21:09.101 "data_offset": 0, 00:21:09.101 "data_size": 65536 00:21:09.101 }, 00:21:09.101 { 00:21:09.102 "name": "BaseBdev3", 00:21:09.102 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:09.102 "is_configured": true, 00:21:09.102 "data_offset": 0, 00:21:09.102 "data_size": 65536 00:21:09.102 }, 00:21:09.102 
{ 00:21:09.102 "name": "BaseBdev4", 00:21:09.102 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:09.102 "is_configured": true, 00:21:09.102 "data_offset": 0, 00:21:09.102 "data_size": 65536 00:21:09.102 } 00:21:09.102 ] 00:21:09.102 }' 00:21:09.102 11:57:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:09.102 11:57:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:09.669 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:09.669 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:21:09.669 [2024-05-14 11:57:36.690418] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:09.669 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=65536 00:21:09.669 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.669 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # data_offset=0 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:09.928 11:57:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:10.188 [2024-05-14 11:57:37.183490] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1915020 00:21:10.188 /dev/nbd0 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:10.188 
11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:10.188 1+0 records in 00:21:10.188 1+0 records out 00:21:10.188 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251215 s, 16.3 MB/s 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:21:10.188 11:57:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:21:18.369 65536+0 records in 00:21:18.369 65536+0 records out 00:21:18.369 33554432 bytes (34 MB, 32 MiB) copied, 6.82967 s, 4.9 MB/s 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:18.369 11:57:44 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:18.369 [2024-05-14 11:57:44.334075] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:18.369 [2024-05-14 11:57:44.505702] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local 
expected_state=online 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:18.369 "name": "raid_bdev1", 00:21:18.369 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:18.369 "strip_size_kb": 0, 00:21:18.369 "state": "online", 00:21:18.369 "raid_level": "raid1", 00:21:18.369 "superblock": false, 00:21:18.369 "num_base_bdevs": 4, 00:21:18.369 "num_base_bdevs_discovered": 3, 00:21:18.369 "num_base_bdevs_operational": 3, 00:21:18.369 "base_bdevs_list": [ 00:21:18.369 { 00:21:18.369 "name": null, 00:21:18.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.369 "is_configured": false, 00:21:18.369 "data_offset": 0, 00:21:18.369 "data_size": 65536 00:21:18.369 }, 00:21:18.369 { 00:21:18.369 "name": "BaseBdev2", 00:21:18.369 "uuid": "7fb15362-33ad-51c7-9796-16dcca847902", 00:21:18.369 "is_configured": true, 00:21:18.369 "data_offset": 0, 00:21:18.369 "data_size": 65536 00:21:18.369 }, 
00:21:18.369 { 00:21:18.369 "name": "BaseBdev3", 00:21:18.369 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:18.369 "is_configured": true, 00:21:18.369 "data_offset": 0, 00:21:18.369 "data_size": 65536 00:21:18.369 }, 00:21:18.369 { 00:21:18.369 "name": "BaseBdev4", 00:21:18.369 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:18.369 "is_configured": true, 00:21:18.369 "data_offset": 0, 00:21:18.369 "data_size": 65536 00:21:18.369 } 00:21:18.369 ] 00:21:18.369 }' 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:18.369 11:57:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:18.369 11:57:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:18.627 [2024-05-14 11:57:45.580559] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:18.627 [2024-05-14 11:57:45.584670] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ab38f0 00:21:18.627 [2024-05-14 11:57:45.587064] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:18.627 11:57:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # sleep 1 00:21:19.559 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:19.559 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:19.559 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:19.559 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:19.559 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:19.559 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.559 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.816 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:19.816 "name": "raid_bdev1", 00:21:19.816 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:19.816 "strip_size_kb": 0, 00:21:19.816 "state": "online", 00:21:19.816 "raid_level": "raid1", 00:21:19.816 "superblock": false, 00:21:19.816 "num_base_bdevs": 4, 00:21:19.816 "num_base_bdevs_discovered": 4, 00:21:19.816 "num_base_bdevs_operational": 4, 00:21:19.816 "process": { 00:21:19.816 "type": "rebuild", 00:21:19.816 "target": "spare", 00:21:19.816 "progress": { 00:21:19.816 "blocks": 24576, 00:21:19.816 "percent": 37 00:21:19.816 } 00:21:19.816 }, 00:21:19.816 "base_bdevs_list": [ 00:21:19.816 { 00:21:19.816 "name": "spare", 00:21:19.816 "uuid": "d74d5b77-13c8-5ff3-a3e4-4aa4a5121bc6", 00:21:19.816 "is_configured": true, 00:21:19.816 "data_offset": 0, 00:21:19.816 "data_size": 65536 00:21:19.816 }, 00:21:19.816 { 00:21:19.816 "name": "BaseBdev2", 00:21:19.816 "uuid": "7fb15362-33ad-51c7-9796-16dcca847902", 00:21:19.816 "is_configured": true, 00:21:19.816 "data_offset": 0, 00:21:19.816 "data_size": 65536 00:21:19.816 }, 00:21:19.816 { 00:21:19.816 "name": "BaseBdev3", 00:21:19.816 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:19.816 "is_configured": true, 00:21:19.816 "data_offset": 0, 00:21:19.816 "data_size": 65536 00:21:19.816 }, 00:21:19.816 { 00:21:19.816 "name": "BaseBdev4", 00:21:19.816 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:19.816 "is_configured": true, 00:21:19.816 "data_offset": 0, 00:21:19.816 "data_size": 65536 00:21:19.816 } 00:21:19.816 ] 00:21:19.816 }' 00:21:19.816 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:20.074 11:57:46 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:20.074 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:20.074 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:20.074 11:57:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:20.332 [2024-05-14 11:57:47.170303] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:20.332 [2024-05-14 11:57:47.199798] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:20.332 [2024-05-14 11:57:47.199842] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:20.332 11:57:47 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.332 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.590 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:20.590 "name": "raid_bdev1", 00:21:20.590 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:20.590 "strip_size_kb": 0, 00:21:20.590 "state": "online", 00:21:20.590 "raid_level": "raid1", 00:21:20.590 "superblock": false, 00:21:20.590 "num_base_bdevs": 4, 00:21:20.590 "num_base_bdevs_discovered": 3, 00:21:20.590 "num_base_bdevs_operational": 3, 00:21:20.590 "base_bdevs_list": [ 00:21:20.590 { 00:21:20.590 "name": null, 00:21:20.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.590 "is_configured": false, 00:21:20.590 "data_offset": 0, 00:21:20.590 "data_size": 65536 00:21:20.590 }, 00:21:20.590 { 00:21:20.590 "name": "BaseBdev2", 00:21:20.590 "uuid": "7fb15362-33ad-51c7-9796-16dcca847902", 00:21:20.590 "is_configured": true, 00:21:20.590 "data_offset": 0, 00:21:20.590 "data_size": 65536 00:21:20.590 }, 00:21:20.590 { 00:21:20.590 "name": "BaseBdev3", 00:21:20.590 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:20.590 "is_configured": true, 00:21:20.590 "data_offset": 0, 00:21:20.590 "data_size": 65536 00:21:20.590 }, 00:21:20.590 { 00:21:20.590 "name": "BaseBdev4", 00:21:20.590 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:20.590 "is_configured": true, 00:21:20.590 "data_offset": 0, 00:21:20.590 "data_size": 65536 00:21:20.590 } 00:21:20.590 ] 00:21:20.590 }' 00:21:20.590 11:57:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:20.590 11:57:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:21.156 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@664 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:21:21.156 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:21.156 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:21.156 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:21.156 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:21.156 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.156 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.415 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:21.415 "name": "raid_bdev1", 00:21:21.415 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:21.415 "strip_size_kb": 0, 00:21:21.415 "state": "online", 00:21:21.415 "raid_level": "raid1", 00:21:21.415 "superblock": false, 00:21:21.415 "num_base_bdevs": 4, 00:21:21.415 "num_base_bdevs_discovered": 3, 00:21:21.415 "num_base_bdevs_operational": 3, 00:21:21.415 "base_bdevs_list": [ 00:21:21.415 { 00:21:21.415 "name": null, 00:21:21.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.415 "is_configured": false, 00:21:21.415 "data_offset": 0, 00:21:21.415 "data_size": 65536 00:21:21.415 }, 00:21:21.415 { 00:21:21.415 "name": "BaseBdev2", 00:21:21.415 "uuid": "7fb15362-33ad-51c7-9796-16dcca847902", 00:21:21.415 "is_configured": true, 00:21:21.415 "data_offset": 0, 00:21:21.415 "data_size": 65536 00:21:21.415 }, 00:21:21.415 { 00:21:21.415 "name": "BaseBdev3", 00:21:21.415 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:21.415 "is_configured": true, 00:21:21.415 "data_offset": 0, 00:21:21.415 "data_size": 65536 00:21:21.415 }, 00:21:21.415 { 00:21:21.415 "name": "BaseBdev4", 00:21:21.415 
"uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:21.415 "is_configured": true, 00:21:21.415 "data_offset": 0, 00:21:21.415 "data_size": 65536 00:21:21.415 } 00:21:21.415 ] 00:21:21.415 }' 00:21:21.415 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:21.415 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:21.415 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:21.415 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:21.415 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:21.673 [2024-05-14 11:57:48.631685] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:21.673 [2024-05-14 11:57:48.636303] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ac5e80 00:21:21.673 [2024-05-14 11:57:48.637868] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:21.673 11:57:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # sleep 1 00:21:22.606 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:22.606 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:22.606 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:22.606 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:22.606 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:22.606 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.606 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.864 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:22.864 "name": "raid_bdev1", 00:21:22.864 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:22.864 "strip_size_kb": 0, 00:21:22.864 "state": "online", 00:21:22.864 "raid_level": "raid1", 00:21:22.864 "superblock": false, 00:21:22.864 "num_base_bdevs": 4, 00:21:22.864 "num_base_bdevs_discovered": 4, 00:21:22.864 "num_base_bdevs_operational": 4, 00:21:22.864 "process": { 00:21:22.864 "type": "rebuild", 00:21:22.864 "target": "spare", 00:21:22.864 "progress": { 00:21:22.864 "blocks": 24576, 00:21:22.864 "percent": 37 00:21:22.864 } 00:21:22.864 }, 00:21:22.864 "base_bdevs_list": [ 00:21:22.864 { 00:21:22.864 "name": "spare", 00:21:22.864 "uuid": "d74d5b77-13c8-5ff3-a3e4-4aa4a5121bc6", 00:21:22.864 "is_configured": true, 00:21:22.864 "data_offset": 0, 00:21:22.864 "data_size": 65536 00:21:22.864 }, 00:21:22.864 { 00:21:22.864 "name": "BaseBdev2", 00:21:22.864 "uuid": "7fb15362-33ad-51c7-9796-16dcca847902", 00:21:22.864 "is_configured": true, 00:21:22.864 "data_offset": 0, 00:21:22.864 "data_size": 65536 00:21:22.864 }, 00:21:22.864 { 00:21:22.864 "name": "BaseBdev3", 00:21:22.864 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:22.864 "is_configured": true, 00:21:22.864 "data_offset": 0, 00:21:22.864 "data_size": 65536 00:21:22.864 }, 00:21:22.864 { 00:21:22.864 "name": "BaseBdev4", 00:21:22.864 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:22.864 "is_configured": true, 00:21:22.864 "data_offset": 0, 00:21:22.864 "data_size": 65536 00:21:22.864 } 00:21:22.864 ] 00:21:22.864 }' 00:21:22.864 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:22.864 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == 
\r\e\b\u\i\l\d ]] 00:21:22.864 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:23.122 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:23.122 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # '[' false = true ']' 00:21:23.122 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=4 00:21:23.122 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:21:23.122 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # '[' 4 -gt 2 ']' 00:21:23.122 11:57:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@700 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:23.380 [2024-05-14 11:57:50.217519] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:23.380 [2024-05-14 11:57:50.250627] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1ac5e80 00:21:23.381 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@703 -- # base_bdevs[1]= 00:21:23.381 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@704 -- # (( num_base_bdevs_operational-- )) 00:21:23.381 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:23.381 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:23.381 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:23.381 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:23.381 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:23.381 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.381 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:23.639 "name": "raid_bdev1", 00:21:23.639 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:23.639 "strip_size_kb": 0, 00:21:23.639 "state": "online", 00:21:23.639 "raid_level": "raid1", 00:21:23.639 "superblock": false, 00:21:23.639 "num_base_bdevs": 4, 00:21:23.639 "num_base_bdevs_discovered": 3, 00:21:23.639 "num_base_bdevs_operational": 3, 00:21:23.639 "process": { 00:21:23.639 "type": "rebuild", 00:21:23.639 "target": "spare", 00:21:23.639 "progress": { 00:21:23.639 "blocks": 36864, 00:21:23.639 "percent": 56 00:21:23.639 } 00:21:23.639 }, 00:21:23.639 "base_bdevs_list": [ 00:21:23.639 { 00:21:23.639 "name": "spare", 00:21:23.639 "uuid": "d74d5b77-13c8-5ff3-a3e4-4aa4a5121bc6", 00:21:23.639 "is_configured": true, 00:21:23.639 "data_offset": 0, 00:21:23.639 "data_size": 65536 00:21:23.639 }, 00:21:23.639 { 00:21:23.639 "name": null, 00:21:23.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:23.639 "is_configured": false, 00:21:23.639 "data_offset": 0, 00:21:23.639 "data_size": 65536 00:21:23.639 }, 00:21:23.639 { 00:21:23.639 "name": "BaseBdev3", 00:21:23.639 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:23.639 "is_configured": true, 00:21:23.639 "data_offset": 0, 00:21:23.639 "data_size": 65536 00:21:23.639 }, 00:21:23.639 { 00:21:23.639 "name": "BaseBdev4", 00:21:23.639 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:23.639 "is_configured": true, 00:21:23.639 "data_offset": 0, 00:21:23.639 "data_size": 65536 00:21:23.639 } 00:21:23.639 ] 00:21:23.639 }' 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == 
\r\e\b\u\i\l\d ]] 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@711 -- # local timeout=726 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.639 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:23.896 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:23.896 "name": "raid_bdev1", 00:21:23.896 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:23.896 "strip_size_kb": 0, 00:21:23.896 "state": "online", 00:21:23.896 "raid_level": "raid1", 00:21:23.896 "superblock": false, 00:21:23.896 "num_base_bdevs": 4, 00:21:23.896 "num_base_bdevs_discovered": 3, 00:21:23.896 "num_base_bdevs_operational": 3, 00:21:23.896 "process": { 00:21:23.896 "type": "rebuild", 00:21:23.896 "target": "spare", 00:21:23.896 "progress": { 00:21:23.896 "blocks": 43008, 00:21:23.896 "percent": 65 00:21:23.896 } 00:21:23.896 }, 00:21:23.896 "base_bdevs_list": [ 
00:21:23.896 { 00:21:23.896 "name": "spare", 00:21:23.896 "uuid": "d74d5b77-13c8-5ff3-a3e4-4aa4a5121bc6", 00:21:23.896 "is_configured": true, 00:21:23.896 "data_offset": 0, 00:21:23.896 "data_size": 65536 00:21:23.896 }, 00:21:23.896 { 00:21:23.896 "name": null, 00:21:23.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:23.896 "is_configured": false, 00:21:23.897 "data_offset": 0, 00:21:23.897 "data_size": 65536 00:21:23.897 }, 00:21:23.897 { 00:21:23.897 "name": "BaseBdev3", 00:21:23.897 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:23.897 "is_configured": true, 00:21:23.897 "data_offset": 0, 00:21:23.897 "data_size": 65536 00:21:23.897 }, 00:21:23.897 { 00:21:23.897 "name": "BaseBdev4", 00:21:23.897 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:23.897 "is_configured": true, 00:21:23.897 "data_offset": 0, 00:21:23.897 "data_size": 65536 00:21:23.897 } 00:21:23.897 ] 00:21:23.897 }' 00:21:23.897 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:23.897 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:23.897 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:23.897 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:23.897 11:57:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@716 -- # sleep 1 00:21:24.829 [2024-05-14 11:57:51.863161] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:24.829 [2024-05-14 11:57:51.863222] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:24.829 [2024-05-14 11:57:51.863259] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.088 11:57:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:25.088 11:57:51 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:25.088 11:57:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:25.088 11:57:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:25.088 11:57:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:25.088 11:57:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:25.088 11:57:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.088 11:57:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:25.347 "name": "raid_bdev1", 00:21:25.347 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:25.347 "strip_size_kb": 0, 00:21:25.347 "state": "online", 00:21:25.347 "raid_level": "raid1", 00:21:25.347 "superblock": false, 00:21:25.347 "num_base_bdevs": 4, 00:21:25.347 "num_base_bdevs_discovered": 3, 00:21:25.347 "num_base_bdevs_operational": 3, 00:21:25.347 "base_bdevs_list": [ 00:21:25.347 { 00:21:25.347 "name": "spare", 00:21:25.347 "uuid": "d74d5b77-13c8-5ff3-a3e4-4aa4a5121bc6", 00:21:25.347 "is_configured": true, 00:21:25.347 "data_offset": 0, 00:21:25.347 "data_size": 65536 00:21:25.347 }, 00:21:25.347 { 00:21:25.347 "name": null, 00:21:25.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.347 "is_configured": false, 00:21:25.347 "data_offset": 0, 00:21:25.347 "data_size": 65536 00:21:25.347 }, 00:21:25.347 { 00:21:25.347 "name": "BaseBdev3", 00:21:25.347 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:25.347 "is_configured": true, 00:21:25.347 "data_offset": 0, 00:21:25.347 "data_size": 65536 00:21:25.347 }, 00:21:25.347 { 00:21:25.347 
"name": "BaseBdev4", 00:21:25.347 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:25.347 "is_configured": true, 00:21:25.347 "data_offset": 0, 00:21:25.347 "data_size": 65536 00:21:25.347 } 00:21:25.347 ] 00:21:25.347 }' 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # break 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.347 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:25.605 "name": "raid_bdev1", 00:21:25.605 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:25.605 "strip_size_kb": 0, 00:21:25.605 "state": "online", 00:21:25.605 "raid_level": "raid1", 00:21:25.605 "superblock": false, 00:21:25.605 "num_base_bdevs": 4, 00:21:25.605 
"num_base_bdevs_discovered": 3, 00:21:25.605 "num_base_bdevs_operational": 3, 00:21:25.605 "base_bdevs_list": [ 00:21:25.605 { 00:21:25.605 "name": "spare", 00:21:25.605 "uuid": "d74d5b77-13c8-5ff3-a3e4-4aa4a5121bc6", 00:21:25.605 "is_configured": true, 00:21:25.605 "data_offset": 0, 00:21:25.605 "data_size": 65536 00:21:25.605 }, 00:21:25.605 { 00:21:25.605 "name": null, 00:21:25.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.605 "is_configured": false, 00:21:25.605 "data_offset": 0, 00:21:25.605 "data_size": 65536 00:21:25.605 }, 00:21:25.605 { 00:21:25.605 "name": "BaseBdev3", 00:21:25.605 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:25.605 "is_configured": true, 00:21:25.605 "data_offset": 0, 00:21:25.605 "data_size": 65536 00:21:25.605 }, 00:21:25.605 { 00:21:25.605 "name": "BaseBdev4", 00:21:25.605 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:25.605 "is_configured": true, 00:21:25.605 "data_offset": 0, 00:21:25.605 "data_size": 65536 00:21:25.605 } 00:21:25.605 ] 00:21:25.605 }' 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # 
local strip_size=0 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.605 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.863 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:25.863 "name": "raid_bdev1", 00:21:25.863 "uuid": "64394243-771a-4205-82c4-5eba91c334bc", 00:21:25.863 "strip_size_kb": 0, 00:21:25.863 "state": "online", 00:21:25.863 "raid_level": "raid1", 00:21:25.863 "superblock": false, 00:21:25.863 "num_base_bdevs": 4, 00:21:25.863 "num_base_bdevs_discovered": 3, 00:21:25.863 "num_base_bdevs_operational": 3, 00:21:25.863 "base_bdevs_list": [ 00:21:25.863 { 00:21:25.863 "name": "spare", 00:21:25.863 "uuid": "d74d5b77-13c8-5ff3-a3e4-4aa4a5121bc6", 00:21:25.863 "is_configured": true, 00:21:25.863 "data_offset": 0, 00:21:25.863 "data_size": 65536 00:21:25.863 }, 00:21:25.863 { 00:21:25.863 "name": null, 00:21:25.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.863 "is_configured": false, 00:21:25.863 "data_offset": 0, 00:21:25.863 "data_size": 65536 00:21:25.863 }, 00:21:25.863 { 00:21:25.863 "name": "BaseBdev3", 00:21:25.863 "uuid": "7d5ed690-4df8-5a32-838f-278928e22647", 00:21:25.863 "is_configured": true, 00:21:25.863 "data_offset": 0, 00:21:25.863 "data_size": 65536 
00:21:25.863 }, 00:21:25.863 { 00:21:25.863 "name": "BaseBdev4", 00:21:25.863 "uuid": "da54d7b6-b612-5eed-b4a7-23f2229a20b4", 00:21:25.863 "is_configured": true, 00:21:25.863 "data_offset": 0, 00:21:25.863 "data_size": 65536 00:21:25.863 } 00:21:25.863 ] 00:21:25.863 }' 00:21:25.863 11:57:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:25.863 11:57:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:26.430 11:57:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:26.689 [2024-05-14 11:57:53.599560] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:26.689 [2024-05-14 11:57:53.599587] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:26.689 [2024-05-14 11:57:53.599645] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:26.689 [2024-05-14 11:57:53.599715] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:26.689 [2024-05-14 11:57:53.599728] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19129a0 name raid_bdev1, state offline 00:21:26.689 11:57:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.689 11:57:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # jq length 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # 
nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:26.948 11:57:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:27.207 /dev/nbd0 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:27.207 1+0 records in 00:21:27.207 1+0 records out 00:21:27.207 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269208 s, 15.2 MB/s 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:27.207 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:27.467 /dev/nbd1 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 
-- # (( i = 1 )) 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:27.467 1+0 records in 00:21:27.467 1+0 records out 00:21:27.467 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00034315 s, 11.9 MB/s 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@743 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:27.467 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:27.726 11:57:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@748 -- # '[' false = true ']' 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@795 -- # killprocess 1765959 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@946 -- # '[' -z 1765959 ']' 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # kill -0 1765959 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # uname 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1765959 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1765959' 00:21:28.295 killing process with pid 1765959 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@965 -- # kill 1765959 00:21:28.295 Received shutdown signal, test time was about 60.000000 seconds 00:21:28.295 00:21:28.295 Latency(us) 00:21:28.295 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:28.295 
=================================================================================================================== 00:21:28.295 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:28.295 [2024-05-14 11:57:55.134172] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:28.295 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@970 -- # wait 1765959 00:21:28.295 [2024-05-14 11:57:55.184668] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@797 -- # return 0 00:21:28.554 00:21:28.554 real 0m23.603s 00:21:28.554 user 0m32.142s 00:21:28.554 sys 0m4.959s 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:28.554 ************************************ 00:21:28.554 END TEST raid_rebuild_test 00:21:28.554 ************************************ 00:21:28.554 11:57:55 bdev_raid -- bdev/bdev_raid.sh@824 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:21:28.554 11:57:55 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:21:28.554 11:57:55 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:21:28.554 11:57:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:28.554 ************************************ 00:21:28.554 START TEST raid_rebuild_test_sb 00:21:28.554 ************************************ 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 true false true 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=4 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local 
superblock=true 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local verify=true 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev3 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # echo BaseBdev4 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 
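Earlier in this run (bdev_raid.sh@743), the rebuilt RAID1 array was validated by comparing its two NBD exports byte-for-byte with `cmp -i 0 /dev/nbd0 /dev/nbd1`. A minimal stand-in for that mirror check, using ordinary temp files in place of the NBD devices (the file paths here are illustrative, not from the log):

```shell
# Stand-in for the mirror consistency check: after a rebuild both legs of a
# RAID1 mirror must read back identical, so cmp over the two exports succeeds.
# Plain temp files replace /dev/nbd0 and /dev/nbd1 for illustration.
leg0=$(mktemp) && leg1=$(mktemp)
dd if=/dev/urandom of="$leg0" bs=512 count=128 status=none  # fill one leg
cp "$leg0" "$leg1"                                          # healthy mirror: identical legs
if cmp -i 0 "$leg0" "$leg1"; then
    echo "mirror legs match"
fi
rm -f "$leg0" "$leg1"
```

In the real test the same `cmp` exiting non-zero would fail the rebuild test, since it means the spare was not synchronized with the surviving base bdev.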
00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # local strip_size 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@582 -- # local create_arg 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local data_offset 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # raid_pid=1769239 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@603 -- # waitforlisten 1769239 /var/tmp/spdk-raid.sock 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@827 -- # '[' -z 1769239 ']' 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:28.554 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
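The `waitforlisten 1769239 /var/tmp/spdk-raid.sock` call above blocks until the freshly launched bdevperf process is serving RPCs on its UNIX socket, retrying a bounded number of times; the `waitfornbd` helpers seen earlier do the same against `/proc/partitions`. A hedged sketch of that bounded-retry pattern, polling for a filesystem path instead of a live RPC socket (`wait_for_path` is a hypothetical helper, not part of SPDK's autotest_common.sh):

```shell
# Bounded-retry wait in the spirit of waitforlisten/waitfornbd: poll up to
# $2 times for path $1 to exist, sleeping briefly between attempts.
# wait_for_path is a hypothetical stand-in, not an SPDK helper.
wait_for_path() {
    local path=$1 tries=${2:-20} i
    for (( i = 1; i <= tries; i++ )); do
        [ -e "$path" ] && return 0
        sleep 0.25
    done
    return 1
}

# demo: a background job creates the path while we wait for it
target=$(mktemp -u)
( sleep 0.5 && touch "$target" ) &
wait_for_path "$target" 20 && echo "ready: $target"
wait            # reap the background job
rm -f "$target"
```

The bounded loop is what keeps a hung daemon from stalling the whole pipeline: after the retry budget is spent the helper returns non-zero and the test fails fast instead of waiting forever.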
00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:21:28.554 11:57:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:28.554 [2024-05-14 11:57:55.576707] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:21:28.554 [2024-05-14 11:57:55.576774] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1769239 ] 00:21:28.554 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:28.554 Zero copy mechanism will not be used. 00:21:28.813 [2024-05-14 11:57:55.691114] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:28.813 [2024-05-14 11:57:55.798223] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:28.813 [2024-05-14 11:57:55.856799] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:28.813 [2024-05-14 11:57:55.856834] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:29.750 11:57:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:21:29.750 11:57:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # return 0 00:21:29.750 11:57:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:29.750 11:57:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:29.750 BaseBdev1_malloc 00:21:29.750 11:57:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:30.009 [2024-05-14 
11:57:56.972966] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:30.009 [2024-05-14 11:57:56.973013] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:30.009 [2024-05-14 11:57:56.973035] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa15960 00:21:30.009 [2024-05-14 11:57:56.973048] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:30.009 [2024-05-14 11:57:56.974785] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:30.009 [2024-05-14 11:57:56.974815] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:30.009 BaseBdev1 00:21:30.009 11:57:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:30.009 11:57:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:30.268 BaseBdev2_malloc 00:21:30.268 11:57:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:30.527 [2024-05-14 11:57:57.406964] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:30.527 [2024-05-14 11:57:57.407012] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:30.527 [2024-05-14 11:57:57.407036] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbc8b40 00:21:30.527 [2024-05-14 11:57:57.407048] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:30.527 [2024-05-14 11:57:57.408533] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:30.527 [2024-05-14 11:57:57.408561] vbdev_passthru.c: 
705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:30.527 BaseBdev2 00:21:30.527 11:57:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:30.527 11:57:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:30.785 BaseBdev3_malloc 00:21:30.785 11:57:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:31.044 [2024-05-14 11:57:57.900894] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:31.044 [2024-05-14 11:57:57.900942] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.044 [2024-05-14 11:57:57.900970] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa0f4c0 00:21:31.044 [2024-05-14 11:57:57.900982] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.044 [2024-05-14 11:57:57.902486] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.044 [2024-05-14 11:57:57.902515] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:31.044 BaseBdev3 00:21:31.044 11:57:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:21:31.044 11:57:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:31.303 BaseBdev4_malloc 00:21:31.303 11:57:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
BaseBdev4_malloc -p BaseBdev4 00:21:31.561 [2024-05-14 11:57:58.398907] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:31.561 [2024-05-14 11:57:58.398953] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.561 [2024-05-14 11:57:58.398971] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa11ec0 00:21:31.561 [2024-05-14 11:57:58.398983] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.561 [2024-05-14 11:57:58.400377] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.561 [2024-05-14 11:57:58.400413] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:31.561 BaseBdev4 00:21:31.561 11:57:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:31.819 spare_malloc 00:21:31.819 11:57:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:31.819 spare_delay 00:21:32.079 11:57:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:32.079 [2024-05-14 11:57:59.129684] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:32.079 [2024-05-14 11:57:59.129730] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:32.079 [2024-05-14 11:57:59.129749] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa12ec0 00:21:32.079 [2024-05-14 11:57:59.129762] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:21:32.079 [2024-05-14 11:57:59.131225] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:32.079 [2024-05-14 11:57:59.131253] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:32.079 spare 00:21:32.079 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:32.338 [2024-05-14 11:57:59.370350] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:32.338 [2024-05-14 11:57:59.371546] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:32.338 [2024-05-14 11:57:59.371608] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:32.338 [2024-05-14 11:57:59.371655] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:32.338 [2024-05-14 11:57:59.371843] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xa149a0 00:21:32.338 [2024-05-14 11:57:59.371855] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:32.338 [2024-05-14 11:57:59.372036] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa11990 00:21:32.338 [2024-05-14 11:57:59.372183] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa149a0 00:21:32.338 [2024-05-14 11:57:59.372197] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa149a0 00:21:32.338 [2024-05-14 11:57:59.372289] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- 
# local raid_bdev_name=raid_bdev1 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.338 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.597 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:32.597 "name": "raid_bdev1", 00:21:32.598 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:32.598 "strip_size_kb": 0, 00:21:32.598 "state": "online", 00:21:32.598 "raid_level": "raid1", 00:21:32.598 "superblock": true, 00:21:32.598 "num_base_bdevs": 4, 00:21:32.598 "num_base_bdevs_discovered": 4, 00:21:32.598 "num_base_bdevs_operational": 4, 00:21:32.598 "base_bdevs_list": [ 00:21:32.598 { 00:21:32.598 "name": "BaseBdev1", 00:21:32.598 "uuid": "6f4741fb-afae-57d3-8735-c1e7e180e14f", 00:21:32.598 "is_configured": true, 00:21:32.598 "data_offset": 2048, 00:21:32.598 "data_size": 63488 00:21:32.598 }, 00:21:32.598 { 00:21:32.598 "name": "BaseBdev2", 00:21:32.598 "uuid": 
"68f7bbe3-2de0-5d28-a922-b27fa5067633", 00:21:32.598 "is_configured": true, 00:21:32.598 "data_offset": 2048, 00:21:32.598 "data_size": 63488 00:21:32.598 }, 00:21:32.598 { 00:21:32.598 "name": "BaseBdev3", 00:21:32.598 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:32.598 "is_configured": true, 00:21:32.598 "data_offset": 2048, 00:21:32.598 "data_size": 63488 00:21:32.598 }, 00:21:32.598 { 00:21:32.598 "name": "BaseBdev4", 00:21:32.598 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:32.598 "is_configured": true, 00:21:32.598 "data_offset": 2048, 00:21:32.598 "data_size": 63488 00:21:32.598 } 00:21:32.598 ] 00:21:32.598 }' 00:21:32.598 11:57:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:32.598 11:57:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:33.166 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:33.166 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:21:33.424 [2024-05-14 11:58:00.465511] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:33.424 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=63488 00:21:33.425 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.425 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # data_offset=2048 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@629 -- # '[' true = true 
']' 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:33.707 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:33.983 [2024-05-14 11:58:00.950574] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa11990 00:21:33.983 /dev/nbd0 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:33.983 11:58:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:33.983 1+0 records in 00:21:33.983 1+0 records out 00:21:33.983 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000293564 s, 14.0 MB/s 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:21:33.983 11:58:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 
00:21:42.107 63488+0 records in 00:21:42.107 63488+0 records out 00:21:42.107 32505856 bytes (33 MB, 31 MiB) copied, 7.25474 s, 4.5 MB/s 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:42.107 [2024-05-14 11:58:08.542039] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:42.107 [2024-05-14 11:58:08.781877] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.107 11:58:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.107 11:58:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:42.107 "name": "raid_bdev1", 00:21:42.107 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:42.107 "strip_size_kb": 0, 00:21:42.107 "state": "online", 00:21:42.107 "raid_level": "raid1", 00:21:42.107 "superblock": true, 
00:21:42.107 "num_base_bdevs": 4, 00:21:42.107 "num_base_bdevs_discovered": 3, 00:21:42.107 "num_base_bdevs_operational": 3, 00:21:42.108 "base_bdevs_list": [ 00:21:42.108 { 00:21:42.108 "name": null, 00:21:42.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.108 "is_configured": false, 00:21:42.108 "data_offset": 2048, 00:21:42.108 "data_size": 63488 00:21:42.108 }, 00:21:42.108 { 00:21:42.108 "name": "BaseBdev2", 00:21:42.108 "uuid": "68f7bbe3-2de0-5d28-a922-b27fa5067633", 00:21:42.108 "is_configured": true, 00:21:42.108 "data_offset": 2048, 00:21:42.108 "data_size": 63488 00:21:42.108 }, 00:21:42.108 { 00:21:42.108 "name": "BaseBdev3", 00:21:42.108 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:42.108 "is_configured": true, 00:21:42.108 "data_offset": 2048, 00:21:42.108 "data_size": 63488 00:21:42.108 }, 00:21:42.108 { 00:21:42.108 "name": "BaseBdev4", 00:21:42.108 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:42.108 "is_configured": true, 00:21:42.108 "data_offset": 2048, 00:21:42.108 "data_size": 63488 00:21:42.108 } 00:21:42.108 ] 00:21:42.108 }' 00:21:42.108 11:58:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:42.108 11:58:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:42.676 11:58:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:42.935 [2024-05-14 11:58:09.860747] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:42.935 [2024-05-14 11:58:09.864824] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa0d040 00:21:42.935 [2024-05-14 11:58:09.867191] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:42.935 11:58:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # sleep 1 00:21:43.873 11:58:10 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:43.873 11:58:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:43.873 11:58:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:43.873 11:58:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:43.873 11:58:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:43.873 11:58:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.873 11:58:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:44.132 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:44.132 "name": "raid_bdev1", 00:21:44.132 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:44.132 "strip_size_kb": 0, 00:21:44.132 "state": "online", 00:21:44.132 "raid_level": "raid1", 00:21:44.132 "superblock": true, 00:21:44.132 "num_base_bdevs": 4, 00:21:44.132 "num_base_bdevs_discovered": 4, 00:21:44.132 "num_base_bdevs_operational": 4, 00:21:44.132 "process": { 00:21:44.132 "type": "rebuild", 00:21:44.132 "target": "spare", 00:21:44.132 "progress": { 00:21:44.132 "blocks": 24576, 00:21:44.132 "percent": 38 00:21:44.132 } 00:21:44.132 }, 00:21:44.132 "base_bdevs_list": [ 00:21:44.132 { 00:21:44.132 "name": "spare", 00:21:44.132 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:44.132 "is_configured": true, 00:21:44.132 "data_offset": 2048, 00:21:44.132 "data_size": 63488 00:21:44.132 }, 00:21:44.132 { 00:21:44.132 "name": "BaseBdev2", 00:21:44.132 "uuid": "68f7bbe3-2de0-5d28-a922-b27fa5067633", 00:21:44.132 "is_configured": true, 00:21:44.132 "data_offset": 2048, 00:21:44.132 "data_size": 63488 
00:21:44.132 }, 00:21:44.132 { 00:21:44.132 "name": "BaseBdev3", 00:21:44.132 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:44.132 "is_configured": true, 00:21:44.132 "data_offset": 2048, 00:21:44.132 "data_size": 63488 00:21:44.132 }, 00:21:44.132 { 00:21:44.132 "name": "BaseBdev4", 00:21:44.132 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:44.132 "is_configured": true, 00:21:44.132 "data_offset": 2048, 00:21:44.132 "data_size": 63488 00:21:44.132 } 00:21:44.132 ] 00:21:44.132 }' 00:21:44.132 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:44.132 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:44.132 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:44.391 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:44.391 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:44.391 [2024-05-14 11:58:11.446354] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:44.651 [2024-05-14 11:58:11.479908] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:44.651 [2024-05-14 11:58:11.479954] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local 
raid_level=raid1 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.651 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:44.911 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:44.911 "name": "raid_bdev1", 00:21:44.911 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:44.911 "strip_size_kb": 0, 00:21:44.911 "state": "online", 00:21:44.911 "raid_level": "raid1", 00:21:44.911 "superblock": true, 00:21:44.911 "num_base_bdevs": 4, 00:21:44.911 "num_base_bdevs_discovered": 3, 00:21:44.911 "num_base_bdevs_operational": 3, 00:21:44.911 "base_bdevs_list": [ 00:21:44.911 { 00:21:44.911 "name": null, 00:21:44.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.911 "is_configured": false, 00:21:44.911 "data_offset": 2048, 00:21:44.911 "data_size": 63488 00:21:44.911 }, 00:21:44.911 { 00:21:44.911 "name": "BaseBdev2", 00:21:44.911 "uuid": "68f7bbe3-2de0-5d28-a922-b27fa5067633", 00:21:44.911 "is_configured": true, 00:21:44.911 "data_offset": 2048, 00:21:44.911 "data_size": 63488 00:21:44.911 }, 00:21:44.911 { 00:21:44.911 "name": "BaseBdev3", 00:21:44.911 "uuid": 
"655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:44.911 "is_configured": true, 00:21:44.911 "data_offset": 2048, 00:21:44.911 "data_size": 63488 00:21:44.911 }, 00:21:44.911 { 00:21:44.911 "name": "BaseBdev4", 00:21:44.911 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:44.911 "is_configured": true, 00:21:44.911 "data_offset": 2048, 00:21:44.911 "data_size": 63488 00:21:44.911 } 00:21:44.911 ] 00:21:44.911 }' 00:21:44.911 11:58:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:44.911 11:58:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:45.480 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:45.480 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:45.480 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:45.480 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:45.480 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:45.480 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.480 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.739 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:45.739 "name": "raid_bdev1", 00:21:45.739 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:45.739 "strip_size_kb": 0, 00:21:45.739 "state": "online", 00:21:45.739 "raid_level": "raid1", 00:21:45.739 "superblock": true, 00:21:45.739 "num_base_bdevs": 4, 00:21:45.739 "num_base_bdevs_discovered": 3, 00:21:45.739 "num_base_bdevs_operational": 3, 00:21:45.739 "base_bdevs_list": [ 00:21:45.739 { 
00:21:45.739 "name": null, 00:21:45.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.739 "is_configured": false, 00:21:45.739 "data_offset": 2048, 00:21:45.739 "data_size": 63488 00:21:45.739 }, 00:21:45.739 { 00:21:45.739 "name": "BaseBdev2", 00:21:45.739 "uuid": "68f7bbe3-2de0-5d28-a922-b27fa5067633", 00:21:45.739 "is_configured": true, 00:21:45.739 "data_offset": 2048, 00:21:45.739 "data_size": 63488 00:21:45.739 }, 00:21:45.739 { 00:21:45.739 "name": "BaseBdev3", 00:21:45.739 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:45.739 "is_configured": true, 00:21:45.739 "data_offset": 2048, 00:21:45.739 "data_size": 63488 00:21:45.739 }, 00:21:45.739 { 00:21:45.739 "name": "BaseBdev4", 00:21:45.739 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:45.739 "is_configured": true, 00:21:45.739 "data_offset": 2048, 00:21:45.739 "data_size": 63488 00:21:45.739 } 00:21:45.739 ] 00:21:45.739 }' 00:21:45.739 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:45.739 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:45.739 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:45.739 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:45.739 11:58:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:45.998 [2024-05-14 11:58:12.899710] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:45.998 [2024-05-14 11:58:12.904364] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x71d090 00:21:45.998 [2024-05-14 11:58:12.905923] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:45.998 11:58:12 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # sleep 1 00:21:46.935 11:58:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:46.935 11:58:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:46.935 11:58:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:46.935 11:58:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:46.935 11:58:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:46.935 11:58:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.935 11:58:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.194 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:47.194 "name": "raid_bdev1", 00:21:47.194 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:47.194 "strip_size_kb": 0, 00:21:47.194 "state": "online", 00:21:47.194 "raid_level": "raid1", 00:21:47.194 "superblock": true, 00:21:47.194 "num_base_bdevs": 4, 00:21:47.194 "num_base_bdevs_discovered": 4, 00:21:47.194 "num_base_bdevs_operational": 4, 00:21:47.194 "process": { 00:21:47.194 "type": "rebuild", 00:21:47.194 "target": "spare", 00:21:47.194 "progress": { 00:21:47.194 "blocks": 24576, 00:21:47.194 "percent": 38 00:21:47.194 } 00:21:47.194 }, 00:21:47.194 "base_bdevs_list": [ 00:21:47.194 { 00:21:47.194 "name": "spare", 00:21:47.194 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:47.194 "is_configured": true, 00:21:47.194 "data_offset": 2048, 00:21:47.194 "data_size": 63488 00:21:47.194 }, 00:21:47.194 { 00:21:47.194 "name": "BaseBdev2", 00:21:47.194 "uuid": "68f7bbe3-2de0-5d28-a922-b27fa5067633", 00:21:47.194 
"is_configured": true, 00:21:47.194 "data_offset": 2048, 00:21:47.194 "data_size": 63488 00:21:47.194 }, 00:21:47.194 { 00:21:47.194 "name": "BaseBdev3", 00:21:47.194 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:47.194 "is_configured": true, 00:21:47.194 "data_offset": 2048, 00:21:47.194 "data_size": 63488 00:21:47.194 }, 00:21:47.194 { 00:21:47.194 "name": "BaseBdev4", 00:21:47.194 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:47.194 "is_configured": true, 00:21:47.194 "data_offset": 2048, 00:21:47.194 "data_size": 63488 00:21:47.194 } 00:21:47.194 ] 00:21:47.194 }' 00:21:47.194 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:47.195 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:47.195 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:47.195 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:47.195 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:21:47.195 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:21:47.195 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:21:47.195 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=4 00:21:47.195 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:21:47.195 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # '[' 4 -gt 2 ']' 00:21:47.195 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@700 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:47.454 [2024-05-14 11:58:14.482302] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev2 00:21:47.454 [2024-05-14 11:58:14.518368] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x71d090 00:21:47.713 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@703 -- # base_bdevs[1]= 00:21:47.713 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@704 -- # (( num_base_bdevs_operational-- )) 00:21:47.713 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:47.713 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:47.713 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:47.713 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:47.713 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:47.713 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.713 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:47.973 "name": "raid_bdev1", 00:21:47.973 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:47.973 "strip_size_kb": 0, 00:21:47.973 "state": "online", 00:21:47.973 "raid_level": "raid1", 00:21:47.973 "superblock": true, 00:21:47.973 "num_base_bdevs": 4, 00:21:47.973 "num_base_bdevs_discovered": 3, 00:21:47.973 "num_base_bdevs_operational": 3, 00:21:47.973 "process": { 00:21:47.973 "type": "rebuild", 00:21:47.973 "target": "spare", 00:21:47.973 "progress": { 00:21:47.973 "blocks": 38912, 00:21:47.973 "percent": 61 00:21:47.973 } 00:21:47.973 }, 00:21:47.973 "base_bdevs_list": [ 00:21:47.973 { 00:21:47.973 "name": "spare", 
00:21:47.973 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:47.973 "is_configured": true, 00:21:47.973 "data_offset": 2048, 00:21:47.973 "data_size": 63488 00:21:47.973 }, 00:21:47.973 { 00:21:47.973 "name": null, 00:21:47.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.973 "is_configured": false, 00:21:47.973 "data_offset": 2048, 00:21:47.973 "data_size": 63488 00:21:47.973 }, 00:21:47.973 { 00:21:47.973 "name": "BaseBdev3", 00:21:47.973 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:47.973 "is_configured": true, 00:21:47.973 "data_offset": 2048, 00:21:47.973 "data_size": 63488 00:21:47.973 }, 00:21:47.973 { 00:21:47.973 "name": "BaseBdev4", 00:21:47.973 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:47.973 "is_configured": true, 00:21:47.973 "data_offset": 2048, 00:21:47.973 "data_size": 63488 00:21:47.973 } 00:21:47.973 ] 00:21:47.973 }' 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@711 -- # local timeout=750 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local 
target=spare 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.973 11:58:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:48.233 11:58:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:48.233 "name": "raid_bdev1", 00:21:48.233 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:48.233 "strip_size_kb": 0, 00:21:48.233 "state": "online", 00:21:48.233 "raid_level": "raid1", 00:21:48.233 "superblock": true, 00:21:48.233 "num_base_bdevs": 4, 00:21:48.233 "num_base_bdevs_discovered": 3, 00:21:48.233 "num_base_bdevs_operational": 3, 00:21:48.233 "process": { 00:21:48.233 "type": "rebuild", 00:21:48.233 "target": "spare", 00:21:48.233 "progress": { 00:21:48.233 "blocks": 45056, 00:21:48.233 "percent": 70 00:21:48.233 } 00:21:48.233 }, 00:21:48.233 "base_bdevs_list": [ 00:21:48.233 { 00:21:48.233 "name": "spare", 00:21:48.233 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:48.233 "is_configured": true, 00:21:48.233 "data_offset": 2048, 00:21:48.233 "data_size": 63488 00:21:48.233 }, 00:21:48.233 { 00:21:48.233 "name": null, 00:21:48.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.233 "is_configured": false, 00:21:48.233 "data_offset": 2048, 00:21:48.233 "data_size": 63488 00:21:48.233 }, 00:21:48.233 { 00:21:48.233 "name": "BaseBdev3", 00:21:48.233 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:48.233 "is_configured": true, 00:21:48.233 "data_offset": 2048, 00:21:48.233 "data_size": 63488 00:21:48.233 }, 00:21:48.233 { 00:21:48.233 "name": "BaseBdev4", 00:21:48.233 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:48.233 "is_configured": true, 00:21:48.233 "data_offset": 2048, 
00:21:48.233 "data_size": 63488 00:21:48.233 } 00:21:48.233 ] 00:21:48.233 }' 00:21:48.233 11:58:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:48.233 11:58:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:48.233 11:58:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:48.233 11:58:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:48.233 11:58:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@716 -- # sleep 1 00:21:49.170 [2024-05-14 11:58:16.030195] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:49.170 [2024-05-14 11:58:16.030254] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:49.170 [2024-05-14 11:58:16.030352] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:49.430 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:21:49.430 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:49.430 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:49.430 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:49.430 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:49.430 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:49.430 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.430 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:49.689 "name": "raid_bdev1", 00:21:49.689 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:49.689 "strip_size_kb": 0, 00:21:49.689 "state": "online", 00:21:49.689 "raid_level": "raid1", 00:21:49.689 "superblock": true, 00:21:49.689 "num_base_bdevs": 4, 00:21:49.689 "num_base_bdevs_discovered": 3, 00:21:49.689 "num_base_bdevs_operational": 3, 00:21:49.689 "base_bdevs_list": [ 00:21:49.689 { 00:21:49.689 "name": "spare", 00:21:49.689 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:49.689 "is_configured": true, 00:21:49.689 "data_offset": 2048, 00:21:49.689 "data_size": 63488 00:21:49.689 }, 00:21:49.689 { 00:21:49.689 "name": null, 00:21:49.689 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.689 "is_configured": false, 00:21:49.689 "data_offset": 2048, 00:21:49.689 "data_size": 63488 00:21:49.689 }, 00:21:49.689 { 00:21:49.689 "name": "BaseBdev3", 00:21:49.689 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:49.689 "is_configured": true, 00:21:49.689 "data_offset": 2048, 00:21:49.689 "data_size": 63488 00:21:49.689 }, 00:21:49.689 { 00:21:49.689 "name": "BaseBdev4", 00:21:49.689 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:49.689 "is_configured": true, 00:21:49.689 "data_offset": 2048, 00:21:49.689 "data_size": 63488 00:21:49.689 } 00:21:49.689 ] 00:21:49.689 }' 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # break 00:21:49.689 11:58:16 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.689 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:49.949 "name": "raid_bdev1", 00:21:49.949 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:49.949 "strip_size_kb": 0, 00:21:49.949 "state": "online", 00:21:49.949 "raid_level": "raid1", 00:21:49.949 "superblock": true, 00:21:49.949 "num_base_bdevs": 4, 00:21:49.949 "num_base_bdevs_discovered": 3, 00:21:49.949 "num_base_bdevs_operational": 3, 00:21:49.949 "base_bdevs_list": [ 00:21:49.949 { 00:21:49.949 "name": "spare", 00:21:49.949 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:49.949 "is_configured": true, 00:21:49.949 "data_offset": 2048, 00:21:49.949 "data_size": 63488 00:21:49.949 }, 00:21:49.949 { 00:21:49.949 "name": null, 00:21:49.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.949 "is_configured": false, 00:21:49.949 "data_offset": 2048, 00:21:49.949 "data_size": 63488 00:21:49.949 }, 00:21:49.949 { 00:21:49.949 "name": "BaseBdev3", 00:21:49.949 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:49.949 "is_configured": true, 00:21:49.949 "data_offset": 2048, 00:21:49.949 "data_size": 
63488 00:21:49.949 }, 00:21:49.949 { 00:21:49.949 "name": "BaseBdev4", 00:21:49.949 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:49.949 "is_configured": true, 00:21:49.949 "data_offset": 2048, 00:21:49.949 "data_size": 63488 00:21:49.949 } 00:21:49.949 ] 00:21:49.949 }' 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:49.949 11:58:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.208 11:58:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:50.208 "name": "raid_bdev1", 00:21:50.208 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:50.208 "strip_size_kb": 0, 00:21:50.208 "state": "online", 00:21:50.208 "raid_level": "raid1", 00:21:50.208 "superblock": true, 00:21:50.208 "num_base_bdevs": 4, 00:21:50.208 "num_base_bdevs_discovered": 3, 00:21:50.208 "num_base_bdevs_operational": 3, 00:21:50.208 "base_bdevs_list": [ 00:21:50.208 { 00:21:50.208 "name": "spare", 00:21:50.208 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:50.208 "is_configured": true, 00:21:50.208 "data_offset": 2048, 00:21:50.208 "data_size": 63488 00:21:50.208 }, 00:21:50.208 { 00:21:50.208 "name": null, 00:21:50.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.208 "is_configured": false, 00:21:50.208 "data_offset": 2048, 00:21:50.208 "data_size": 63488 00:21:50.208 }, 00:21:50.208 { 00:21:50.208 "name": "BaseBdev3", 00:21:50.208 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:50.208 "is_configured": true, 00:21:50.208 "data_offset": 2048, 00:21:50.208 "data_size": 63488 00:21:50.208 }, 00:21:50.208 { 00:21:50.208 "name": "BaseBdev4", 00:21:50.208 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:50.208 "is_configured": true, 00:21:50.208 "data_offset": 2048, 00:21:50.208 "data_size": 63488 00:21:50.208 } 00:21:50.208 ] 00:21:50.208 }' 00:21:50.208 11:58:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:50.208 11:58:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:50.777 11:58:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:51.035 [2024-05-14 
11:58:18.035228] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:51.035 [2024-05-14 11:58:18.035261] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:51.035 [2024-05-14 11:58:18.035322] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:51.035 [2024-05-14 11:58:18.035393] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:51.035 [2024-05-14 11:58:18.035413] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa149a0 name raid_bdev1, state offline 00:21:51.035 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.035 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # jq length 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:51.295 11:58:18 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:51.295 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:51.554 /dev/nbd0 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:51.554 1+0 records in 00:21:51.554 1+0 records out 00:21:51.554 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277731 s, 14.7 MB/s 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:51.554 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:51.812 /dev/nbd1 00:21:51.812 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:51.812 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:51.812 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:21:51.812 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:21:51.812 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:51.812 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:51.812 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 
20 )) 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:51.813 1+0 records in 00:21:51.813 1+0 records out 00:21:51.813 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000344278 s, 11.9 MB/s 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:51.813 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@743 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:52.072 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:52.072 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:52.072 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:52.072 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:52.072 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:21:52.072 11:58:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:52.072 11:58:18 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:52.330 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:52.589 
11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:21:52.589 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:52.848 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:53.107 [2024-05-14 11:58:19.943127] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:53.107 [2024-05-14 11:58:19.943179] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:53.107 [2024-05-14 11:58:19.943200] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa16620 00:21:53.107 [2024-05-14 11:58:19.943214] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.107 [2024-05-14 11:58:19.944842] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.107 [2024-05-14 11:58:19.944872] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:53.107 [2024-05-14 11:58:19.944944] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:53.107 [2024-05-14 11:58:19.944975] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:53.107 BaseBdev1 00:21:53.107 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:21:53.107 11:58:19 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z '' ']' 00:21:53.107 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # continue 00:21:53.107 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:21:53.107 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev3 ']' 00:21:53.107 11:58:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev3 00:21:53.366 11:58:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:53.366 [2024-05-14 11:58:20.436423] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:53.366 [2024-05-14 11:58:20.436460] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:53.366 [2024-05-14 11:58:20.436480] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa172d0 00:21:53.366 [2024-05-14 11:58:20.436492] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.366 [2024-05-14 11:58:20.436825] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.366 [2024-05-14 11:58:20.436849] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:53.366 [2024-05-14 11:58:20.436909] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev3 00:21:53.366 [2024-05-14 11:58:20.436921] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev3 (4) greater than existing raid bdev raid_bdev1 (1) 00:21:53.366 [2024-05-14 11:58:20.436931] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:21:53.366 [2024-05-14 11:58:20.436946] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbb8180 name raid_bdev1, state configuring 00:21:53.366 [2024-05-14 11:58:20.436977] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:53.366 BaseBdev3 00:21:53.625 11:58:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:21:53.625 11:58:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev4 ']' 00:21:53.625 11:58:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev4 00:21:53.625 11:58:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:53.884 [2024-05-14 11:58:20.925910] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:53.884 [2024-05-14 11:58:20.925951] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:53.885 [2024-05-14 11:58:20.925970] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa0dda0 00:21:53.885 [2024-05-14 11:58:20.925982] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.885 [2024-05-14 11:58:20.926295] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.885 [2024-05-14 11:58:20.926311] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:53.885 [2024-05-14 11:58:20.926364] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev4 00:21:53.885 [2024-05-14 11:58:20.926381] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:53.885 BaseBdev4 00:21:53.885 11:58:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:54.145 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:54.404 [2024-05-14 11:58:21.415194] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:54.404 [2024-05-14 11:58:21.415228] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:54.404 [2024-05-14 11:58:21.415246] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa16320 00:21:54.404 [2024-05-14 11:58:21.415258] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:54.404 [2024-05-14 11:58:21.415614] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:54.404 [2024-05-14 11:58:21.415634] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:54.404 [2024-05-14 11:58:21.415689] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:21:54.404 [2024-05-14 11:58:21.415707] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:54.404 spare 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:54.404 11:58:21 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.404 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:54.663 [2024-05-14 11:58:21.516055] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xbb1910 00:21:54.663 [2024-05-14 11:58:21.516072] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:54.663 [2024-05-14 11:58:21.516266] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa165f0 00:21:54.663 [2024-05-14 11:58:21.516425] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbb1910 00:21:54.663 [2024-05-14 11:58:21.516436] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbb1910 00:21:54.663 [2024-05-14 11:58:21.516547] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:54.663 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:54.663 "name": "raid_bdev1", 00:21:54.663 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:54.663 "strip_size_kb": 0, 00:21:54.663 "state": "online", 00:21:54.663 "raid_level": "raid1", 00:21:54.663 "superblock": true, 00:21:54.663 "num_base_bdevs": 4, 00:21:54.663 "num_base_bdevs_discovered": 3, 
00:21:54.663 "num_base_bdevs_operational": 3, 00:21:54.663 "base_bdevs_list": [ 00:21:54.663 { 00:21:54.663 "name": "spare", 00:21:54.663 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:54.663 "is_configured": true, 00:21:54.663 "data_offset": 2048, 00:21:54.663 "data_size": 63488 00:21:54.663 }, 00:21:54.663 { 00:21:54.663 "name": null, 00:21:54.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.663 "is_configured": false, 00:21:54.663 "data_offset": 2048, 00:21:54.663 "data_size": 63488 00:21:54.663 }, 00:21:54.663 { 00:21:54.663 "name": "BaseBdev3", 00:21:54.663 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:54.663 "is_configured": true, 00:21:54.663 "data_offset": 2048, 00:21:54.663 "data_size": 63488 00:21:54.663 }, 00:21:54.663 { 00:21:54.663 "name": "BaseBdev4", 00:21:54.663 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:54.663 "is_configured": true, 00:21:54.663 "data_offset": 2048, 00:21:54.663 "data_size": 63488 00:21:54.663 } 00:21:54.663 ] 00:21:54.663 }' 00:21:54.663 11:58:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:54.663 11:58:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:55.230 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:55.230 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:55.230 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:55.230 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:55.230 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:55.230 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.230 11:58:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:55.519 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:55.519 "name": "raid_bdev1", 00:21:55.519 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:55.519 "strip_size_kb": 0, 00:21:55.519 "state": "online", 00:21:55.519 "raid_level": "raid1", 00:21:55.519 "superblock": true, 00:21:55.519 "num_base_bdevs": 4, 00:21:55.519 "num_base_bdevs_discovered": 3, 00:21:55.519 "num_base_bdevs_operational": 3, 00:21:55.519 "base_bdevs_list": [ 00:21:55.519 { 00:21:55.519 "name": "spare", 00:21:55.519 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:55.519 "is_configured": true, 00:21:55.519 "data_offset": 2048, 00:21:55.519 "data_size": 63488 00:21:55.519 }, 00:21:55.519 { 00:21:55.519 "name": null, 00:21:55.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.519 "is_configured": false, 00:21:55.519 "data_offset": 2048, 00:21:55.519 "data_size": 63488 00:21:55.519 }, 00:21:55.519 { 00:21:55.519 "name": "BaseBdev3", 00:21:55.519 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:55.519 "is_configured": true, 00:21:55.519 "data_offset": 2048, 00:21:55.519 "data_size": 63488 00:21:55.519 }, 00:21:55.519 { 00:21:55.519 "name": "BaseBdev4", 00:21:55.519 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:55.519 "is_configured": true, 00:21:55.519 "data_offset": 2048, 00:21:55.519 "data_size": 63488 00:21:55.519 } 00:21:55.519 ] 00:21:55.519 }' 00:21:55.519 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:55.519 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:55.519 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:55.795 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:55.795 11:58:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.795 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:55.795 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:21:55.795 11:58:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:56.053 [2024-05-14 11:58:23.007829] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:56.053 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:56.311 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:56.311 "name": "raid_bdev1", 00:21:56.311 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:56.311 "strip_size_kb": 0, 00:21:56.311 "state": "online", 00:21:56.311 "raid_level": "raid1", 00:21:56.311 "superblock": true, 00:21:56.311 "num_base_bdevs": 4, 00:21:56.311 "num_base_bdevs_discovered": 2, 00:21:56.311 "num_base_bdevs_operational": 2, 00:21:56.311 "base_bdevs_list": [ 00:21:56.311 { 00:21:56.311 "name": null, 00:21:56.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.311 "is_configured": false, 00:21:56.311 "data_offset": 2048, 00:21:56.311 "data_size": 63488 00:21:56.311 }, 00:21:56.311 { 00:21:56.311 "name": null, 00:21:56.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.311 "is_configured": false, 00:21:56.311 "data_offset": 2048, 00:21:56.311 "data_size": 63488 00:21:56.311 }, 00:21:56.311 { 00:21:56.311 "name": "BaseBdev3", 00:21:56.311 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:56.311 "is_configured": true, 00:21:56.311 "data_offset": 2048, 00:21:56.311 "data_size": 63488 00:21:56.311 }, 00:21:56.311 { 00:21:56.311 "name": "BaseBdev4", 00:21:56.311 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:56.311 "is_configured": true, 00:21:56.311 "data_offset": 2048, 00:21:56.311 "data_size": 63488 00:21:56.311 } 00:21:56.311 ] 00:21:56.311 }' 00:21:56.311 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:56.311 11:58:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:56.876 11:58:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:57.133 
[2024-05-14 11:58:24.098887] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:57.133 [2024-05-14 11:58:24.099047] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:57.133 [2024-05-14 11:58:24.099064] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:57.133 [2024-05-14 11:58:24.099093] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:57.133 [2024-05-14 11:58:24.103097] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb25f0 00:21:57.133 [2024-05-14 11:58:24.105325] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:57.133 11:58:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # sleep 1 00:21:58.066 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:58.066 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:58.066 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:58.066 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:58.066 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:58.066 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.066 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.325 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:58.325 "name": "raid_bdev1", 00:21:58.325 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:58.325 
"strip_size_kb": 0, 00:21:58.325 "state": "online", 00:21:58.325 "raid_level": "raid1", 00:21:58.325 "superblock": true, 00:21:58.325 "num_base_bdevs": 4, 00:21:58.325 "num_base_bdevs_discovered": 3, 00:21:58.325 "num_base_bdevs_operational": 3, 00:21:58.325 "process": { 00:21:58.325 "type": "rebuild", 00:21:58.325 "target": "spare", 00:21:58.325 "progress": { 00:21:58.325 "blocks": 24576, 00:21:58.325 "percent": 38 00:21:58.325 } 00:21:58.325 }, 00:21:58.325 "base_bdevs_list": [ 00:21:58.325 { 00:21:58.325 "name": "spare", 00:21:58.325 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:21:58.325 "is_configured": true, 00:21:58.325 "data_offset": 2048, 00:21:58.325 "data_size": 63488 00:21:58.325 }, 00:21:58.325 { 00:21:58.325 "name": null, 00:21:58.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.325 "is_configured": false, 00:21:58.325 "data_offset": 2048, 00:21:58.325 "data_size": 63488 00:21:58.325 }, 00:21:58.325 { 00:21:58.325 "name": "BaseBdev3", 00:21:58.325 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:58.325 "is_configured": true, 00:21:58.325 "data_offset": 2048, 00:21:58.325 "data_size": 63488 00:21:58.325 }, 00:21:58.325 { 00:21:58.325 "name": "BaseBdev4", 00:21:58.325 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:58.325 "is_configured": true, 00:21:58.325 "data_offset": 2048, 00:21:58.325 "data_size": 63488 00:21:58.325 } 00:21:58.325 ] 00:21:58.325 }' 00:21:58.325 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:58.583 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:58.583 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:58.583 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:58.583 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:58.841 [2024-05-14 11:58:25.688601] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:58.841 [2024-05-14 11:58:25.717991] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:58.841 [2024-05-14 11:58:25.718033] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.841 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.099 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:21:59.099 "name": "raid_bdev1", 00:21:59.099 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:21:59.099 "strip_size_kb": 0, 00:21:59.099 "state": "online", 00:21:59.099 "raid_level": "raid1", 00:21:59.099 "superblock": true, 00:21:59.099 "num_base_bdevs": 4, 00:21:59.099 "num_base_bdevs_discovered": 2, 00:21:59.099 "num_base_bdevs_operational": 2, 00:21:59.099 "base_bdevs_list": [ 00:21:59.099 { 00:21:59.099 "name": null, 00:21:59.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.099 "is_configured": false, 00:21:59.099 "data_offset": 2048, 00:21:59.099 "data_size": 63488 00:21:59.099 }, 00:21:59.099 { 00:21:59.099 "name": null, 00:21:59.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.099 "is_configured": false, 00:21:59.099 "data_offset": 2048, 00:21:59.099 "data_size": 63488 00:21:59.099 }, 00:21:59.099 { 00:21:59.099 "name": "BaseBdev3", 00:21:59.099 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:21:59.099 "is_configured": true, 00:21:59.099 "data_offset": 2048, 00:21:59.099 "data_size": 63488 00:21:59.099 }, 00:21:59.099 { 00:21:59.099 "name": "BaseBdev4", 00:21:59.099 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:21:59.099 "is_configured": true, 00:21:59.099 "data_offset": 2048, 00:21:59.099 "data_size": 63488 00:21:59.099 } 00:21:59.099 ] 00:21:59.099 }' 00:21:59.099 11:58:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:59.099 11:58:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:59.666 11:58:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:59.925 [2024-05-14 11:58:26.804865] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:59.925 [2024-05-14 11:58:26.804919] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:59.925 
[2024-05-14 11:58:26.804945] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa16ea0 00:21:59.925 [2024-05-14 11:58:26.804958] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:59.925 [2024-05-14 11:58:26.805354] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:59.925 [2024-05-14 11:58:26.805373] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:59.925 [2024-05-14 11:58:26.805469] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:21:59.925 [2024-05-14 11:58:26.805482] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:59.925 [2024-05-14 11:58:26.805493] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:59.925 [2024-05-14 11:58:26.805511] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:59.925 [2024-05-14 11:58:26.809504] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb6c30 00:21:59.925 spare 00:21:59.925 [2024-05-14 11:58:26.811035] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:59.925 11:58:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # sleep 1 00:22:00.859 11:58:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:00.859 11:58:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:00.859 11:58:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:00.859 11:58:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:00.860 11:58:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:00.860 
11:58:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.860 11:58:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.118 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:01.118 "name": "raid_bdev1", 00:22:01.118 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:22:01.118 "strip_size_kb": 0, 00:22:01.118 "state": "online", 00:22:01.118 "raid_level": "raid1", 00:22:01.118 "superblock": true, 00:22:01.118 "num_base_bdevs": 4, 00:22:01.118 "num_base_bdevs_discovered": 3, 00:22:01.118 "num_base_bdevs_operational": 3, 00:22:01.118 "process": { 00:22:01.118 "type": "rebuild", 00:22:01.118 "target": "spare", 00:22:01.118 "progress": { 00:22:01.118 "blocks": 24576, 00:22:01.118 "percent": 38 00:22:01.118 } 00:22:01.118 }, 00:22:01.118 "base_bdevs_list": [ 00:22:01.118 { 00:22:01.118 "name": "spare", 00:22:01.118 "uuid": "2cc3ee07-b518-59b6-8d45-e8ee26ddcd2c", 00:22:01.118 "is_configured": true, 00:22:01.118 "data_offset": 2048, 00:22:01.118 "data_size": 63488 00:22:01.118 }, 00:22:01.118 { 00:22:01.118 "name": null, 00:22:01.118 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.118 "is_configured": false, 00:22:01.118 "data_offset": 2048, 00:22:01.118 "data_size": 63488 00:22:01.118 }, 00:22:01.118 { 00:22:01.118 "name": "BaseBdev3", 00:22:01.118 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:22:01.118 "is_configured": true, 00:22:01.118 "data_offset": 2048, 00:22:01.118 "data_size": 63488 00:22:01.118 }, 00:22:01.118 { 00:22:01.118 "name": "BaseBdev4", 00:22:01.118 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:22:01.118 "is_configured": true, 00:22:01.118 "data_offset": 2048, 00:22:01.118 "data_size": 63488 00:22:01.118 } 00:22:01.118 ] 00:22:01.118 }' 00:22:01.118 11:58:28 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:01.118 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:01.118 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:01.118 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:01.118 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:01.377 [2024-05-14 11:58:28.390432] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:01.377 [2024-05-14 11:58:28.423378] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:01.377 [2024-05-14 11:58:28.423425] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:01.377 
11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.377 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.635 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:01.635 "name": "raid_bdev1", 00:22:01.635 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:22:01.635 "strip_size_kb": 0, 00:22:01.635 "state": "online", 00:22:01.635 "raid_level": "raid1", 00:22:01.635 "superblock": true, 00:22:01.635 "num_base_bdevs": 4, 00:22:01.635 "num_base_bdevs_discovered": 2, 00:22:01.635 "num_base_bdevs_operational": 2, 00:22:01.635 "base_bdevs_list": [ 00:22:01.635 { 00:22:01.635 "name": null, 00:22:01.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.635 "is_configured": false, 00:22:01.635 "data_offset": 2048, 00:22:01.635 "data_size": 63488 00:22:01.635 }, 00:22:01.635 { 00:22:01.635 "name": null, 00:22:01.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.635 "is_configured": false, 00:22:01.635 "data_offset": 2048, 00:22:01.635 "data_size": 63488 00:22:01.635 }, 00:22:01.635 { 00:22:01.635 "name": "BaseBdev3", 00:22:01.635 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:22:01.635 "is_configured": true, 00:22:01.635 "data_offset": 2048, 00:22:01.635 "data_size": 63488 00:22:01.635 }, 00:22:01.635 { 00:22:01.635 "name": "BaseBdev4", 00:22:01.635 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:22:01.635 "is_configured": true, 00:22:01.635 "data_offset": 2048, 00:22:01.635 "data_size": 63488 00:22:01.635 } 00:22:01.635 ] 00:22:01.635 }' 00:22:01.635 11:58:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:01.635 11:58:28 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:02.571 "name": "raid_bdev1", 00:22:02.571 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:22:02.571 "strip_size_kb": 0, 00:22:02.571 "state": "online", 00:22:02.571 "raid_level": "raid1", 00:22:02.571 "superblock": true, 00:22:02.571 "num_base_bdevs": 4, 00:22:02.571 "num_base_bdevs_discovered": 2, 00:22:02.571 "num_base_bdevs_operational": 2, 00:22:02.571 "base_bdevs_list": [ 00:22:02.571 { 00:22:02.571 "name": null, 00:22:02.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:02.571 "is_configured": false, 00:22:02.571 "data_offset": 2048, 00:22:02.571 "data_size": 63488 00:22:02.571 }, 00:22:02.571 { 00:22:02.571 "name": null, 00:22:02.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:02.571 "is_configured": false, 00:22:02.571 "data_offset": 2048, 00:22:02.571 "data_size": 63488 00:22:02.571 }, 00:22:02.571 { 00:22:02.571 "name": "BaseBdev3", 00:22:02.571 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:22:02.571 "is_configured": 
true, 00:22:02.571 "data_offset": 2048, 00:22:02.571 "data_size": 63488 00:22:02.571 }, 00:22:02.571 { 00:22:02.571 "name": "BaseBdev4", 00:22:02.571 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:22:02.571 "is_configured": true, 00:22:02.571 "data_offset": 2048, 00:22:02.571 "data_size": 63488 00:22:02.571 } 00:22:02.571 ] 00:22:02.571 }' 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:02.571 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:02.830 11:58:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:03.089 [2024-05-14 11:58:30.112232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:03.089 [2024-05-14 11:58:30.112291] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.089 [2024-05-14 11:58:30.112313] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa169f0 00:22:03.089 [2024-05-14 11:58:30.112326] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.089 [2024-05-14 11:58:30.112694] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.089 [2024-05-14 11:58:30.112712] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:03.089 [2024-05-14 11:58:30.112780] 
bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:03.089 [2024-05-14 11:58:30.112792] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:03.089 [2024-05-14 11:58:30.112803] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:03.089 BaseBdev1 00:22:03.089 11:58:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@786 -- # sleep 1 00:22:04.467 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:04.467 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:04.467 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:04.467 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.468 11:58:31 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:04.468 "name": "raid_bdev1", 00:22:04.468 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:22:04.468 "strip_size_kb": 0, 00:22:04.468 "state": "online", 00:22:04.468 "raid_level": "raid1", 00:22:04.468 "superblock": true, 00:22:04.468 "num_base_bdevs": 4, 00:22:04.468 "num_base_bdevs_discovered": 2, 00:22:04.468 "num_base_bdevs_operational": 2, 00:22:04.468 "base_bdevs_list": [ 00:22:04.468 { 00:22:04.468 "name": null, 00:22:04.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.468 "is_configured": false, 00:22:04.468 "data_offset": 2048, 00:22:04.468 "data_size": 63488 00:22:04.468 }, 00:22:04.468 { 00:22:04.468 "name": null, 00:22:04.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.468 "is_configured": false, 00:22:04.468 "data_offset": 2048, 00:22:04.468 "data_size": 63488 00:22:04.468 }, 00:22:04.468 { 00:22:04.468 "name": "BaseBdev3", 00:22:04.468 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:22:04.468 "is_configured": true, 00:22:04.468 "data_offset": 2048, 00:22:04.468 "data_size": 63488 00:22:04.468 }, 00:22:04.468 { 00:22:04.468 "name": "BaseBdev4", 00:22:04.468 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:22:04.468 "is_configured": true, 00:22:04.468 "data_offset": 2048, 00:22:04.468 "data_size": 63488 00:22:04.468 } 00:22:04.468 ] 00:22:04.468 }' 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:04.468 11:58:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:05.036 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:05.036 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:05.036 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:05.036 11:58:31 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@185 -- # local target=none 00:22:05.036 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:05.036 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:05.036 11:58:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:05.296 "name": "raid_bdev1", 00:22:05.296 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:22:05.296 "strip_size_kb": 0, 00:22:05.296 "state": "online", 00:22:05.296 "raid_level": "raid1", 00:22:05.296 "superblock": true, 00:22:05.296 "num_base_bdevs": 4, 00:22:05.296 "num_base_bdevs_discovered": 2, 00:22:05.296 "num_base_bdevs_operational": 2, 00:22:05.296 "base_bdevs_list": [ 00:22:05.296 { 00:22:05.296 "name": null, 00:22:05.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:05.296 "is_configured": false, 00:22:05.296 "data_offset": 2048, 00:22:05.296 "data_size": 63488 00:22:05.296 }, 00:22:05.296 { 00:22:05.296 "name": null, 00:22:05.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:05.296 "is_configured": false, 00:22:05.296 "data_offset": 2048, 00:22:05.296 "data_size": 63488 00:22:05.296 }, 00:22:05.296 { 00:22:05.296 "name": "BaseBdev3", 00:22:05.296 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:22:05.296 "is_configured": true, 00:22:05.296 "data_offset": 2048, 00:22:05.296 "data_size": 63488 00:22:05.296 }, 00:22:05.296 { 00:22:05.296 "name": "BaseBdev4", 00:22:05.296 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:22:05.296 "is_configured": true, 00:22:05.296 "data_offset": 2048, 00:22:05.296 "data_size": 63488 00:22:05.296 } 00:22:05.296 ] 00:22:05.296 }' 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // 
"none"' 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ 
-x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:05.296 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:05.554 [2024-05-14 11:58:32.554799] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:05.554 [2024-05-14 11:58:32.554947] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:05.554 [2024-05-14 11:58:32.554962] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:05.554 request: 00:22:05.554 { 00:22:05.554 "raid_bdev": "raid_bdev1", 00:22:05.554 "base_bdev": "BaseBdev1", 00:22:05.554 "method": "bdev_raid_add_base_bdev", 00:22:05.554 "req_id": 1 00:22:05.554 } 00:22:05.554 Got JSON-RPC error response 00:22:05.554 response: 00:22:05.554 { 00:22:05.554 "code": -22, 00:22:05.554 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:05.554 } 00:22:05.554 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:22:05.554 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:05.554 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:05.554 11:58:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:05.554 11:58:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # sleep 1 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local 
expected_state=online 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.932 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:06.932 "name": "raid_bdev1", 00:22:06.932 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:22:06.932 "strip_size_kb": 0, 00:22:06.932 "state": "online", 00:22:06.932 "raid_level": "raid1", 00:22:06.932 "superblock": true, 00:22:06.932 "num_base_bdevs": 4, 00:22:06.932 "num_base_bdevs_discovered": 2, 00:22:06.932 "num_base_bdevs_operational": 2, 00:22:06.932 "base_bdevs_list": [ 00:22:06.933 { 00:22:06.933 "name": null, 00:22:06.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.933 "is_configured": false, 00:22:06.933 "data_offset": 2048, 00:22:06.933 "data_size": 63488 00:22:06.933 }, 00:22:06.933 { 00:22:06.933 "name": null, 00:22:06.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.933 "is_configured": false, 00:22:06.933 "data_offset": 2048, 00:22:06.933 "data_size": 
63488 00:22:06.933 }, 00:22:06.933 { 00:22:06.933 "name": "BaseBdev3", 00:22:06.933 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:22:06.933 "is_configured": true, 00:22:06.933 "data_offset": 2048, 00:22:06.933 "data_size": 63488 00:22:06.933 }, 00:22:06.933 { 00:22:06.933 "name": "BaseBdev4", 00:22:06.933 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:22:06.933 "is_configured": true, 00:22:06.933 "data_offset": 2048, 00:22:06.933 "data_size": 63488 00:22:06.933 } 00:22:06.933 ] 00:22:06.933 }' 00:22:06.933 11:58:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:06.933 11:58:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:07.500 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:07.500 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:07.500 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:07.500 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:07.500 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:07.500 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.500 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.759 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:07.760 "name": "raid_bdev1", 00:22:07.760 "uuid": "1ce0e189-97eb-41ca-a5f6-7fd7ffb949ec", 00:22:07.760 "strip_size_kb": 0, 00:22:07.760 "state": "online", 00:22:07.760 "raid_level": "raid1", 00:22:07.760 "superblock": true, 00:22:07.760 "num_base_bdevs": 4, 00:22:07.760 "num_base_bdevs_discovered": 2, 
00:22:07.760 "num_base_bdevs_operational": 2, 00:22:07.760 "base_bdevs_list": [ 00:22:07.760 { 00:22:07.760 "name": null, 00:22:07.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.760 "is_configured": false, 00:22:07.760 "data_offset": 2048, 00:22:07.760 "data_size": 63488 00:22:07.760 }, 00:22:07.760 { 00:22:07.760 "name": null, 00:22:07.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.760 "is_configured": false, 00:22:07.760 "data_offset": 2048, 00:22:07.760 "data_size": 63488 00:22:07.760 }, 00:22:07.760 { 00:22:07.760 "name": "BaseBdev3", 00:22:07.760 "uuid": "655f6148-7fdf-58d8-9b14-3132db0765a8", 00:22:07.760 "is_configured": true, 00:22:07.760 "data_offset": 2048, 00:22:07.760 "data_size": 63488 00:22:07.760 }, 00:22:07.760 { 00:22:07.760 "name": "BaseBdev4", 00:22:07.760 "uuid": "e48dd4b7-7b1b-59bb-826a-f2a34f24250b", 00:22:07.760 "is_configured": true, 00:22:07.760 "data_offset": 2048, 00:22:07.760 "data_size": 63488 00:22:07.760 } 00:22:07.760 ] 00:22:07.760 }' 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # killprocess 1769239 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@946 -- # '[' -z 1769239 ']' 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # kill -0 1769239 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # uname 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:22:07.760 
11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1769239 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1769239' 00:22:07.760 killing process with pid 1769239 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@965 -- # kill 1769239 00:22:07.760 Received shutdown signal, test time was about 60.000000 seconds 00:22:07.760 00:22:07.760 Latency(us) 00:22:07.760 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:07.760 =================================================================================================================== 00:22:07.760 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:07.760 [2024-05-14 11:58:34.753097] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:07.760 [2024-05-14 11:58:34.753194] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:07.760 11:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@970 -- # wait 1769239 00:22:07.760 [2024-05-14 11:58:34.753252] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:07.760 [2024-05-14 11:58:34.753270] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbb1910 name raid_bdev1, state offline 00:22:07.760 [2024-05-14 11:58:34.801215] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:08.019 11:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@797 -- # return 0 00:22:08.019 00:22:08.019 real 0m39.514s 00:22:08.019 user 0m57.585s 00:22:08.019 sys 0m7.284s 00:22:08.019 11:58:35 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@1122 -- # xtrace_disable 00:22:08.019 11:58:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:08.019 ************************************ 00:22:08.019 END TEST raid_rebuild_test_sb 00:22:08.019 ************************************ 00:22:08.019 11:58:35 bdev_raid -- bdev/bdev_raid.sh@825 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:22:08.019 11:58:35 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:22:08.019 11:58:35 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:22:08.019 11:58:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:08.277 ************************************ 00:22:08.277 START TEST raid_rebuild_test_io 00:22:08.277 ************************************ 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 false true true 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=4 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local superblock=false 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local background_io=true 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local verify=true 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:08.277 
11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev3 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev4 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # local strip_size 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@582 -- # local create_arg 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local data_offset 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # '[' false = true ']' 00:22:08.277 11:58:35 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # raid_pid=1774854 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # waitforlisten 1774854 /var/tmp/spdk-raid.sock 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@827 -- # '[' -z 1774854 ']' 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:08.277 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:22:08.277 11:58:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:08.277 [2024-05-14 11:58:35.187804] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:22:08.277 [2024-05-14 11:58:35.187868] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1774854 ] 00:22:08.277 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:08.277 Zero copy mechanism will not be used. 
00:22:08.277 [2024-05-14 11:58:35.318116] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:08.535 [2024-05-14 11:58:35.425726] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:08.535 [2024-05-14 11:58:35.484878] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:08.535 [2024-05-14 11:58:35.484908] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:09.102 11:58:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:22:09.102 11:58:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # return 0 00:22:09.102 11:58:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:09.102 11:58:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:09.361 BaseBdev1_malloc 00:22:09.361 11:58:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:09.620 [2024-05-14 11:58:36.558741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:09.620 [2024-05-14 11:58:36.558788] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.620 [2024-05-14 11:58:36.558812] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e46960 00:22:09.620 [2024-05-14 11:58:36.558825] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.620 [2024-05-14 11:58:36.560576] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.620 [2024-05-14 11:58:36.560604] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:09.620 BaseBdev1 
00:22:09.620 11:58:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:09.620 11:58:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:09.878 BaseBdev2_malloc 00:22:09.878 11:58:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:10.136 [2024-05-14 11:58:37.053831] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:10.136 [2024-05-14 11:58:37.053875] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:10.136 [2024-05-14 11:58:37.053897] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff9b40 00:22:10.136 [2024-05-14 11:58:37.053910] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:10.136 [2024-05-14 11:58:37.055489] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:10.136 [2024-05-14 11:58:37.055516] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:10.136 BaseBdev2 00:22:10.136 11:58:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:10.136 11:58:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:10.395 BaseBdev3_malloc 00:22:10.395 11:58:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:10.653 [2024-05-14 11:58:37.532923] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:10.653 [2024-05-14 11:58:37.532966] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:10.653 [2024-05-14 11:58:37.532988] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e404c0 00:22:10.653 [2024-05-14 11:58:37.533000] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:10.653 [2024-05-14 11:58:37.534507] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:10.653 [2024-05-14 11:58:37.534534] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:10.653 BaseBdev3 00:22:10.653 11:58:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:10.653 11:58:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:10.912 BaseBdev4_malloc 00:22:10.912 11:58:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:11.170 [2024-05-14 11:58:38.022799] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:11.170 [2024-05-14 11:58:38.022843] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.170 [2024-05-14 11:58:38.022861] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e42ec0 00:22:11.170 [2024-05-14 11:58:38.022874] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.170 [2024-05-14 11:58:38.024421] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.170 [2024-05-14 11:58:38.024447] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:22:11.170 BaseBdev4 00:22:11.170 11:58:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:11.428 spare_malloc 00:22:11.429 11:58:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:11.687 spare_delay 00:22:11.687 11:58:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:11.687 [2024-05-14 11:58:38.757308] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:11.687 [2024-05-14 11:58:38.757355] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.687 [2024-05-14 11:58:38.757373] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e43ec0 00:22:11.687 [2024-05-14 11:58:38.757386] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.687 [2024-05-14 11:58:38.758995] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.687 [2024-05-14 11:58:38.759023] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:11.687 spare 00:22:11.945 11:58:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:11.945 [2024-05-14 11:58:38.997964] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:11.945 [2024-05-14 11:58:38.999331] bdev_raid.c:3122:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:22:11.945 [2024-05-14 11:58:38.999389] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:11.945 [2024-05-14 11:58:38.999450] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:11.945 [2024-05-14 11:58:38.999540] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e459a0 00:22:11.945 [2024-05-14 11:58:38.999551] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:11.945 [2024-05-14 11:58:38.999772] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e42990 00:22:11.945 [2024-05-14 11:58:38.999927] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e459a0 00:22:11.945 [2024-05-14 11:58:38.999937] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e459a0 00:22:11.945 [2024-05-14 11:58:39.000051] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.945 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.203 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:12.203 "name": "raid_bdev1", 00:22:12.203 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:12.203 "strip_size_kb": 0, 00:22:12.203 "state": "online", 00:22:12.203 "raid_level": "raid1", 00:22:12.203 "superblock": false, 00:22:12.203 "num_base_bdevs": 4, 00:22:12.203 "num_base_bdevs_discovered": 4, 00:22:12.203 "num_base_bdevs_operational": 4, 00:22:12.203 "base_bdevs_list": [ 00:22:12.203 { 00:22:12.203 "name": "BaseBdev1", 00:22:12.203 "uuid": "657f2306-8e44-5e1f-9ca2-dd489418e1b3", 00:22:12.203 "is_configured": true, 00:22:12.203 "data_offset": 0, 00:22:12.203 "data_size": 65536 00:22:12.203 }, 00:22:12.203 { 00:22:12.203 "name": "BaseBdev2", 00:22:12.203 "uuid": "c6e9e276-8fc4-520b-9d54-66104f34e753", 00:22:12.203 "is_configured": true, 00:22:12.203 "data_offset": 0, 00:22:12.203 "data_size": 65536 00:22:12.203 }, 00:22:12.203 { 00:22:12.203 "name": "BaseBdev3", 00:22:12.203 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:12.203 "is_configured": true, 00:22:12.203 "data_offset": 0, 00:22:12.203 "data_size": 65536 00:22:12.203 }, 00:22:12.203 { 00:22:12.203 "name": "BaseBdev4", 00:22:12.203 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:12.203 "is_configured": true, 00:22:12.203 "data_offset": 0, 00:22:12.203 "data_size": 65536 00:22:12.203 } 00:22:12.203 ] 00:22:12.203 }' 00:22:12.203 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:22:12.203 11:58:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:13.179 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:13.179 11:58:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:22:13.179 [2024-05-14 11:58:40.089173] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:13.179 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=65536 00:22:13.179 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.179 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:13.439 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # data_offset=0 00:22:13.439 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@626 -- # '[' true = true ']' 00:22:13.439 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:13.439 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:13.439 [2024-05-14 11:58:40.456209] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fe2e00 00:22:13.439 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:13.439 Zero copy mechanism will not be used. 00:22:13.439 Running I/O for 60 seconds... 
00:22:13.698 [2024-05-14 11:58:40.580393] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:13.698 [2024-05-14 11:58:40.588597] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1fe2e00 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:13.698 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.956 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:13.956 "name": "raid_bdev1", 00:22:13.956 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:13.956 "strip_size_kb": 0, 00:22:13.956 "state": "online", 00:22:13.956 "raid_level": "raid1", 00:22:13.956 "superblock": false, 
00:22:13.956 "num_base_bdevs": 4, 00:22:13.956 "num_base_bdevs_discovered": 3, 00:22:13.957 "num_base_bdevs_operational": 3, 00:22:13.957 "base_bdevs_list": [ 00:22:13.957 { 00:22:13.957 "name": null, 00:22:13.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.957 "is_configured": false, 00:22:13.957 "data_offset": 0, 00:22:13.957 "data_size": 65536 00:22:13.957 }, 00:22:13.957 { 00:22:13.957 "name": "BaseBdev2", 00:22:13.957 "uuid": "c6e9e276-8fc4-520b-9d54-66104f34e753", 00:22:13.957 "is_configured": true, 00:22:13.957 "data_offset": 0, 00:22:13.957 "data_size": 65536 00:22:13.957 }, 00:22:13.957 { 00:22:13.957 "name": "BaseBdev3", 00:22:13.957 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:13.957 "is_configured": true, 00:22:13.957 "data_offset": 0, 00:22:13.957 "data_size": 65536 00:22:13.957 }, 00:22:13.957 { 00:22:13.957 "name": "BaseBdev4", 00:22:13.957 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:13.957 "is_configured": true, 00:22:13.957 "data_offset": 0, 00:22:13.957 "data_size": 65536 00:22:13.957 } 00:22:13.957 ] 00:22:13.957 }' 00:22:13.957 11:58:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:13.957 11:58:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:14.522 11:58:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:14.781 [2024-05-14 11:58:41.749033] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:14.781 [2024-05-14 11:58:41.795214] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b4e090 00:22:14.781 [2024-05-14 11:58:41.797617] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:14.781 11:58:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # sleep 1 00:22:15.039 [2024-05-14 
11:58:41.909280] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:15.039 [2024-05-14 11:58:41.909776] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:15.039 [2024-05-14 11:58:42.041334] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:15.298 [2024-05-14 11:58:42.352538] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:15.298 [2024-05-14 11:58:42.352975] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:15.556 [2024-05-14 11:58:42.556249] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:15.556 [2024-05-14 11:58:42.556893] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:15.814 11:58:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:15.814 11:58:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:15.814 11:58:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:15.814 11:58:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:15.814 11:58:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:15.814 11:58:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.814 11:58:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:16.072 
[2024-05-14 11:58:43.013848] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:16.072 [2024-05-14 11:58:43.014429] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:16.072 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:16.072 "name": "raid_bdev1", 00:22:16.072 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:16.072 "strip_size_kb": 0, 00:22:16.072 "state": "online", 00:22:16.072 "raid_level": "raid1", 00:22:16.072 "superblock": false, 00:22:16.072 "num_base_bdevs": 4, 00:22:16.072 "num_base_bdevs_discovered": 4, 00:22:16.072 "num_base_bdevs_operational": 4, 00:22:16.072 "process": { 00:22:16.072 "type": "rebuild", 00:22:16.072 "target": "spare", 00:22:16.072 "progress": { 00:22:16.072 "blocks": 16384, 00:22:16.072 "percent": 25 00:22:16.072 } 00:22:16.072 }, 00:22:16.072 "base_bdevs_list": [ 00:22:16.072 { 00:22:16.072 "name": "spare", 00:22:16.072 "uuid": "e91c28d7-b94c-50ee-ae8b-bc1137b9cbdb", 00:22:16.072 "is_configured": true, 00:22:16.072 "data_offset": 0, 00:22:16.072 "data_size": 65536 00:22:16.072 }, 00:22:16.072 { 00:22:16.072 "name": "BaseBdev2", 00:22:16.073 "uuid": "c6e9e276-8fc4-520b-9d54-66104f34e753", 00:22:16.073 "is_configured": true, 00:22:16.073 "data_offset": 0, 00:22:16.073 "data_size": 65536 00:22:16.073 }, 00:22:16.073 { 00:22:16.073 "name": "BaseBdev3", 00:22:16.073 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:16.073 "is_configured": true, 00:22:16.073 "data_offset": 0, 00:22:16.073 "data_size": 65536 00:22:16.073 }, 00:22:16.073 { 00:22:16.073 "name": "BaseBdev4", 00:22:16.073 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:16.073 "is_configured": true, 00:22:16.073 "data_offset": 0, 00:22:16.073 "data_size": 65536 00:22:16.073 } 00:22:16.073 ] 00:22:16.073 }' 00:22:16.073 11:58:43 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:16.073 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:16.073 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:16.073 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:16.073 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:16.331 [2024-05-14 11:58:43.352534] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:16.331 [2024-05-14 11:58:43.372616] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:16.589 [2024-05-14 11:58:43.463629] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:22:16.590 [2024-05-14 11:58:43.563906] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:16.590 [2024-05-14 11:58:43.576102] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:16.590 [2024-05-14 11:58:43.590924] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1fe2e00 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local 
strip_size=0 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.590 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:16.848 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:16.848 "name": "raid_bdev1", 00:22:16.848 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:16.848 "strip_size_kb": 0, 00:22:16.848 "state": "online", 00:22:16.848 "raid_level": "raid1", 00:22:16.848 "superblock": false, 00:22:16.848 "num_base_bdevs": 4, 00:22:16.848 "num_base_bdevs_discovered": 3, 00:22:16.848 "num_base_bdevs_operational": 3, 00:22:16.848 "base_bdevs_list": [ 00:22:16.848 { 00:22:16.848 "name": null, 00:22:16.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.848 "is_configured": false, 00:22:16.848 "data_offset": 0, 00:22:16.848 "data_size": 65536 00:22:16.848 }, 00:22:16.848 { 00:22:16.848 "name": "BaseBdev2", 00:22:16.848 "uuid": "c6e9e276-8fc4-520b-9d54-66104f34e753", 00:22:16.848 "is_configured": true, 00:22:16.848 "data_offset": 0, 00:22:16.848 "data_size": 65536 00:22:16.848 }, 00:22:16.848 { 00:22:16.848 "name": "BaseBdev3", 00:22:16.848 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:16.848 "is_configured": true, 00:22:16.848 "data_offset": 0, 00:22:16.848 
"data_size": 65536 00:22:16.848 }, 00:22:16.848 { 00:22:16.848 "name": "BaseBdev4", 00:22:16.848 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:16.848 "is_configured": true, 00:22:16.848 "data_offset": 0, 00:22:16.848 "data_size": 65536 00:22:16.848 } 00:22:16.848 ] 00:22:16.848 }' 00:22:16.848 11:58:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:16.848 11:58:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:17.783 "name": "raid_bdev1", 00:22:17.783 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:17.783 "strip_size_kb": 0, 00:22:17.783 "state": "online", 00:22:17.783 "raid_level": "raid1", 00:22:17.783 "superblock": false, 00:22:17.783 "num_base_bdevs": 4, 00:22:17.783 "num_base_bdevs_discovered": 3, 00:22:17.783 "num_base_bdevs_operational": 3, 00:22:17.783 "base_bdevs_list": [ 00:22:17.783 { 00:22:17.783 "name": null, 00:22:17.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.783 "is_configured": false, 
00:22:17.783 "data_offset": 0, 00:22:17.783 "data_size": 65536 00:22:17.783 }, 00:22:17.783 { 00:22:17.783 "name": "BaseBdev2", 00:22:17.783 "uuid": "c6e9e276-8fc4-520b-9d54-66104f34e753", 00:22:17.783 "is_configured": true, 00:22:17.783 "data_offset": 0, 00:22:17.783 "data_size": 65536 00:22:17.783 }, 00:22:17.783 { 00:22:17.783 "name": "BaseBdev3", 00:22:17.783 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:17.783 "is_configured": true, 00:22:17.783 "data_offset": 0, 00:22:17.783 "data_size": 65536 00:22:17.783 }, 00:22:17.783 { 00:22:17.783 "name": "BaseBdev4", 00:22:17.783 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:17.783 "is_configured": true, 00:22:17.783 "data_offset": 0, 00:22:17.783 "data_size": 65536 00:22:17.783 } 00:22:17.783 ] 00:22:17.783 }' 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:17.783 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:18.041 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:18.041 11:58:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:18.041 [2024-05-14 11:58:45.095935] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:18.300 11:58:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # sleep 1 00:22:18.300 [2024-05-14 11:58:45.134360] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fe48d0 00:22:18.300 [2024-05-14 11:58:45.135922] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:18.300 [2024-05-14 11:58:45.248097] bdev_raid.c: 
852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:18.300 [2024-05-14 11:58:45.248599] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:18.558 [2024-05-14 11:58:45.399282] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:18.816 [2024-05-14 11:58:45.902126] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:18.816 [2024-05-14 11:58:45.902306] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:19.075 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:19.075 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:19.075 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:19.075 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:19.075 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:19.075 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.075 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.333 [2024-05-14 11:58:46.166355] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:19.333 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:19.333 "name": "raid_bdev1", 00:22:19.333 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 
00:22:19.333 "strip_size_kb": 0, 00:22:19.333 "state": "online", 00:22:19.333 "raid_level": "raid1", 00:22:19.333 "superblock": false, 00:22:19.333 "num_base_bdevs": 4, 00:22:19.333 "num_base_bdevs_discovered": 4, 00:22:19.333 "num_base_bdevs_operational": 4, 00:22:19.333 "process": { 00:22:19.333 "type": "rebuild", 00:22:19.333 "target": "spare", 00:22:19.333 "progress": { 00:22:19.333 "blocks": 14336, 00:22:19.333 "percent": 21 00:22:19.333 } 00:22:19.333 }, 00:22:19.333 "base_bdevs_list": [ 00:22:19.334 { 00:22:19.334 "name": "spare", 00:22:19.334 "uuid": "e91c28d7-b94c-50ee-ae8b-bc1137b9cbdb", 00:22:19.334 "is_configured": true, 00:22:19.334 "data_offset": 0, 00:22:19.334 "data_size": 65536 00:22:19.334 }, 00:22:19.334 { 00:22:19.334 "name": "BaseBdev2", 00:22:19.334 "uuid": "c6e9e276-8fc4-520b-9d54-66104f34e753", 00:22:19.334 "is_configured": true, 00:22:19.334 "data_offset": 0, 00:22:19.334 "data_size": 65536 00:22:19.334 }, 00:22:19.334 { 00:22:19.334 "name": "BaseBdev3", 00:22:19.334 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:19.334 "is_configured": true, 00:22:19.334 "data_offset": 0, 00:22:19.334 "data_size": 65536 00:22:19.334 }, 00:22:19.334 { 00:22:19.334 "name": "BaseBdev4", 00:22:19.334 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:19.334 "is_configured": true, 00:22:19.334 "data_offset": 0, 00:22:19.334 "data_size": 65536 00:22:19.334 } 00:22:19.334 ] 00:22:19.334 }' 00:22:19.334 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:19.592 [2024-05-14 11:58:46.428027] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:19.592 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:19.592 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:19.592 11:58:46 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:19.592 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # '[' false = true ']' 00:22:19.592 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=4 00:22:19.592 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:22:19.592 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # '[' 4 -gt 2 ']' 00:22:19.592 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@700 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:19.592 [2024-05-14 11:58:46.671388] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:19.850 [2024-05-14 11:58:46.703191] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:19.850 [2024-05-14 11:58:46.824030] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:22:19.850 [2024-05-14 11:58:46.932304] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1fe2e00 00:22:19.850 [2024-05-14 11:58:46.932334] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1fe48d0 00:22:20.109 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@703 -- # base_bdevs[1]= 00:22:20.109 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@704 -- # (( num_base_bdevs_operational-- )) 00:22:20.109 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:20.109 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:20.109 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:20.109 11:58:46 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:20.109 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:20.109 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.109 11:58:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:20.367 "name": "raid_bdev1", 00:22:20.367 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:20.367 "strip_size_kb": 0, 00:22:20.367 "state": "online", 00:22:20.367 "raid_level": "raid1", 00:22:20.367 "superblock": false, 00:22:20.367 "num_base_bdevs": 4, 00:22:20.367 "num_base_bdevs_discovered": 3, 00:22:20.367 "num_base_bdevs_operational": 3, 00:22:20.367 "process": { 00:22:20.367 "type": "rebuild", 00:22:20.367 "target": "spare", 00:22:20.367 "progress": { 00:22:20.367 "blocks": 24576, 00:22:20.367 "percent": 37 00:22:20.367 } 00:22:20.367 }, 00:22:20.367 "base_bdevs_list": [ 00:22:20.367 { 00:22:20.367 "name": "spare", 00:22:20.367 "uuid": "e91c28d7-b94c-50ee-ae8b-bc1137b9cbdb", 00:22:20.367 "is_configured": true, 00:22:20.367 "data_offset": 0, 00:22:20.367 "data_size": 65536 00:22:20.367 }, 00:22:20.367 { 00:22:20.367 "name": null, 00:22:20.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.367 "is_configured": false, 00:22:20.367 "data_offset": 0, 00:22:20.367 "data_size": 65536 00:22:20.367 }, 00:22:20.367 { 00:22:20.367 "name": "BaseBdev3", 00:22:20.367 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:20.367 "is_configured": true, 00:22:20.367 "data_offset": 0, 00:22:20.367 "data_size": 65536 00:22:20.367 }, 00:22:20.367 { 00:22:20.367 "name": "BaseBdev4", 00:22:20.367 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:20.367 
"is_configured": true, 00:22:20.367 "data_offset": 0, 00:22:20.367 "data_size": 65536 00:22:20.367 } 00:22:20.367 ] 00:22:20.367 }' 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@711 -- # local timeout=783 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.367 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.626 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:20.626 "name": "raid_bdev1", 00:22:20.626 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:20.626 "strip_size_kb": 0, 00:22:20.626 "state": "online", 00:22:20.626 "raid_level": "raid1", 00:22:20.626 
"superblock": false, 00:22:20.626 "num_base_bdevs": 4, 00:22:20.626 "num_base_bdevs_discovered": 3, 00:22:20.626 "num_base_bdevs_operational": 3, 00:22:20.626 "process": { 00:22:20.626 "type": "rebuild", 00:22:20.626 "target": "spare", 00:22:20.626 "progress": { 00:22:20.626 "blocks": 30720, 00:22:20.626 "percent": 46 00:22:20.626 } 00:22:20.626 }, 00:22:20.626 "base_bdevs_list": [ 00:22:20.626 { 00:22:20.626 "name": "spare", 00:22:20.626 "uuid": "e91c28d7-b94c-50ee-ae8b-bc1137b9cbdb", 00:22:20.626 "is_configured": true, 00:22:20.626 "data_offset": 0, 00:22:20.626 "data_size": 65536 00:22:20.626 }, 00:22:20.626 { 00:22:20.626 "name": null, 00:22:20.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.626 "is_configured": false, 00:22:20.626 "data_offset": 0, 00:22:20.626 "data_size": 65536 00:22:20.626 }, 00:22:20.626 { 00:22:20.626 "name": "BaseBdev3", 00:22:20.626 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:20.626 "is_configured": true, 00:22:20.626 "data_offset": 0, 00:22:20.626 "data_size": 65536 00:22:20.626 }, 00:22:20.626 { 00:22:20.626 "name": "BaseBdev4", 00:22:20.626 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:20.626 "is_configured": true, 00:22:20.626 "data_offset": 0, 00:22:20.626 "data_size": 65536 00:22:20.626 } 00:22:20.626 ] 00:22:20.626 }' 00:22:20.626 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:20.626 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:20.626 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:20.626 [2024-05-14 11:58:47.662279] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:22:20.626 [2024-05-14 11:58:47.662570] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:22:20.626 
11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:20.626 11:58:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:22:21.192 [2024-05-14 11:58:48.054242] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:22:21.450 [2024-05-14 11:58:48.313493] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:22:21.709 [2024-05-14 11:58:48.649118] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:22:21.709 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:21.709 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:21.709 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:21.709 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:21.709 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:21.709 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:21.709 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.709 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.967 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:21.967 "name": "raid_bdev1", 00:22:21.967 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:21.967 "strip_size_kb": 0, 00:22:21.967 "state": "online", 00:22:21.967 "raid_level": "raid1", 00:22:21.967 "superblock": 
false, 00:22:21.967 "num_base_bdevs": 4, 00:22:21.967 "num_base_bdevs_discovered": 3, 00:22:21.967 "num_base_bdevs_operational": 3, 00:22:21.967 "process": { 00:22:21.967 "type": "rebuild", 00:22:21.967 "target": "spare", 00:22:21.967 "progress": { 00:22:21.967 "blocks": 55296, 00:22:21.967 "percent": 84 00:22:21.967 } 00:22:21.967 }, 00:22:21.967 "base_bdevs_list": [ 00:22:21.967 { 00:22:21.967 "name": "spare", 00:22:21.967 "uuid": "e91c28d7-b94c-50ee-ae8b-bc1137b9cbdb", 00:22:21.967 "is_configured": true, 00:22:21.967 "data_offset": 0, 00:22:21.967 "data_size": 65536 00:22:21.967 }, 00:22:21.967 { 00:22:21.967 "name": null, 00:22:21.967 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.967 "is_configured": false, 00:22:21.967 "data_offset": 0, 00:22:21.967 "data_size": 65536 00:22:21.967 }, 00:22:21.967 { 00:22:21.967 "name": "BaseBdev3", 00:22:21.967 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:21.967 "is_configured": true, 00:22:21.967 "data_offset": 0, 00:22:21.967 "data_size": 65536 00:22:21.967 }, 00:22:21.967 { 00:22:21.967 "name": "BaseBdev4", 00:22:21.967 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:21.967 "is_configured": true, 00:22:21.967 "data_offset": 0, 00:22:21.967 "data_size": 65536 00:22:21.967 } 00:22:21.967 ] 00:22:21.967 }' 00:22:21.967 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:21.967 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:21.967 11:58:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:21.967 [2024-05-14 11:58:48.990200] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:22:21.967 11:58:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:21.967 11:58:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # sleep 
1 00:22:22.534 [2024-05-14 11:58:49.528911] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:22.792 [2024-05-14 11:58:49.629220] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:22.792 [2024-05-14 11:58:49.631709] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:23.050 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:23.050 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:23.050 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:23.050 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:23.050 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:23.050 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:23.050 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.050 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:23.308 "name": "raid_bdev1", 00:22:23.308 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:23.308 "strip_size_kb": 0, 00:22:23.308 "state": "online", 00:22:23.308 "raid_level": "raid1", 00:22:23.308 "superblock": false, 00:22:23.308 "num_base_bdevs": 4, 00:22:23.308 "num_base_bdevs_discovered": 3, 00:22:23.308 "num_base_bdevs_operational": 3, 00:22:23.308 "base_bdevs_list": [ 00:22:23.308 { 00:22:23.308 "name": "spare", 00:22:23.308 "uuid": "e91c28d7-b94c-50ee-ae8b-bc1137b9cbdb", 00:22:23.308 
"is_configured": true, 00:22:23.308 "data_offset": 0, 00:22:23.308 "data_size": 65536 00:22:23.308 }, 00:22:23.308 { 00:22:23.308 "name": null, 00:22:23.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.308 "is_configured": false, 00:22:23.308 "data_offset": 0, 00:22:23.308 "data_size": 65536 00:22:23.308 }, 00:22:23.308 { 00:22:23.308 "name": "BaseBdev3", 00:22:23.308 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:23.308 "is_configured": true, 00:22:23.308 "data_offset": 0, 00:22:23.308 "data_size": 65536 00:22:23.308 }, 00:22:23.308 { 00:22:23.308 "name": "BaseBdev4", 00:22:23.308 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:23.308 "is_configured": true, 00:22:23.308 "data_offset": 0, 00:22:23.308 "data_size": 65536 00:22:23.308 } 00:22:23.308 ] 00:22:23.308 }' 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # break 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.308 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.566 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:23.566 "name": "raid_bdev1", 00:22:23.566 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:23.566 "strip_size_kb": 0, 00:22:23.566 "state": "online", 00:22:23.566 "raid_level": "raid1", 00:22:23.566 "superblock": false, 00:22:23.566 "num_base_bdevs": 4, 00:22:23.566 "num_base_bdevs_discovered": 3, 00:22:23.566 "num_base_bdevs_operational": 3, 00:22:23.566 "base_bdevs_list": [ 00:22:23.566 { 00:22:23.566 "name": "spare", 00:22:23.567 "uuid": "e91c28d7-b94c-50ee-ae8b-bc1137b9cbdb", 00:22:23.567 "is_configured": true, 00:22:23.567 "data_offset": 0, 00:22:23.567 "data_size": 65536 00:22:23.567 }, 00:22:23.567 { 00:22:23.567 "name": null, 00:22:23.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.567 "is_configured": false, 00:22:23.567 "data_offset": 0, 00:22:23.567 "data_size": 65536 00:22:23.567 }, 00:22:23.567 { 00:22:23.567 "name": "BaseBdev3", 00:22:23.567 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:23.567 "is_configured": true, 00:22:23.567 "data_offset": 0, 00:22:23.567 "data_size": 65536 00:22:23.567 }, 00:22:23.567 { 00:22:23.567 "name": "BaseBdev4", 00:22:23.567 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:23.567 "is_configured": true, 00:22:23.567 "data_offset": 0, 00:22:23.567 "data_size": 65536 00:22:23.567 } 00:22:23.567 ] 00:22:23.567 }' 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 
00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.567 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.825 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:23.825 "name": "raid_bdev1", 00:22:23.825 "uuid": "f39ade03-a47e-4b6a-bdf2-9f21a1568c68", 00:22:23.825 "strip_size_kb": 0, 00:22:23.825 "state": "online", 00:22:23.825 "raid_level": "raid1", 00:22:23.825 "superblock": false, 00:22:23.825 "num_base_bdevs": 4, 00:22:23.825 "num_base_bdevs_discovered": 3, 00:22:23.825 "num_base_bdevs_operational": 3, 
00:22:23.825 "base_bdevs_list": [ 00:22:23.825 { 00:22:23.825 "name": "spare", 00:22:23.825 "uuid": "e91c28d7-b94c-50ee-ae8b-bc1137b9cbdb", 00:22:23.825 "is_configured": true, 00:22:23.825 "data_offset": 0, 00:22:23.825 "data_size": 65536 00:22:23.825 }, 00:22:23.825 { 00:22:23.825 "name": null, 00:22:23.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.825 "is_configured": false, 00:22:23.825 "data_offset": 0, 00:22:23.825 "data_size": 65536 00:22:23.825 }, 00:22:23.825 { 00:22:23.825 "name": "BaseBdev3", 00:22:23.825 "uuid": "533289da-3f90-5871-9a39-80a50b451094", 00:22:23.825 "is_configured": true, 00:22:23.825 "data_offset": 0, 00:22:23.825 "data_size": 65536 00:22:23.825 }, 00:22:23.825 { 00:22:23.825 "name": "BaseBdev4", 00:22:23.825 "uuid": "6854e87e-264c-56d6-879d-5f5d4fb826a6", 00:22:23.825 "is_configured": true, 00:22:23.825 "data_offset": 0, 00:22:23.825 "data_size": 65536 00:22:23.825 } 00:22:23.825 ] 00:22:23.825 }' 00:22:23.825 11:58:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:23.825 11:58:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:24.758 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:24.758 [2024-05-14 11:58:51.720138] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:24.758 [2024-05-14 11:58:51.720166] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:22:24.758
00:22:24.758 Latency(us)
00:22:24.758 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:22:24.758 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:22:24.759 raid_bdev1 : 11.25 95.59 286.77 0.00 0.00 14270.53 302.75 121270.09
===================================================================================================================
00:22:24.759 Total : 95.59 286.77 0.00 0.00 14270.53 302.75 121270.09
00:22:24.759 [2024-05-14 11:58:51.736216] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:24.759 [2024-05-14 11:58:51.736244] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:24.759 [2024-05-14 11:58:51.736340] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:24.759 [2024-05-14 11:58:51.736353] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e459a0 name raid_bdev1, state offline 00:22:24.759 0 00:22:24.759 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.759 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # jq length 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@728 -- # '[' true = true ']' 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:25.017
11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:25.017 11:58:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:25.017 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:22:25.276 /dev/nbd0 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:25.276 1+0 records in 00:22:25.276 1+0 records out 00:22:25.276 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275212 s, 14.9 MB/s 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # '[' -z '' ']' 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # continue 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev3 ']' 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@12 -- # local i 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:25.276 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:22:25.534 /dev/nbd1 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:25.534 1+0 records in 00:22:25.534 1+0 records out 00:22:25.534 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262774 s, 15.6 MB/s 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@736 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:25.534 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd1 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev4 ']' 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:25.792 11:58:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:22:26.051 /dev/nbd1 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd1 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:26.051 1+0 records in 00:22:26.051 1+0 records out 00:22:26.051 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250521 s, 16.3 MB/s 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:26.051 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@736 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:26.310 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:26.310 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:26.310 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:26.310 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:26.310 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:26.310 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:26.310 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd0 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:26.569 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@748 -- # '[' false = true ']' 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@795 -- # killprocess 1774854 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@946 -- # '[' -z 1774854 ']' 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # kill -0 1774854 00:22:26.828 
11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # uname 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1774854 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1774854' 00:22:26.828 killing process with pid 1774854 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@965 -- # kill 1774854 00:22:26.828 Received shutdown signal, test time was about 13.212853 seconds 00:22:26.828 00:22:26.828 Latency(us) 00:22:26.828 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:26.828 =================================================================================================================== 00:22:26.828 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:22:26.828 [2024-05-14 11:58:53.703781] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:26.828 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@970 -- # wait 1774854 00:22:26.828 [2024-05-14 11:58:53.746234] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:27.087 11:58:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@797 -- # return 0 00:22:27.087 00:22:27.087 real 0m18.835s 00:22:27.087 user 0m29.209s 00:22:27.087 sys 0m3.339s 00:22:27.087 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:22:27.087 11:58:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:27.087 ************************************ 00:22:27.087 END TEST 
raid_rebuild_test_io 00:22:27.087 ************************************ 00:22:27.087 11:58:53 bdev_raid -- bdev/bdev_raid.sh@826 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:22:27.087 11:58:53 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:22:27.087 11:58:53 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:22:27.087 11:58:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:27.087 ************************************ 00:22:27.087 START TEST raid_rebuild_test_sb_io 00:22:27.087 ************************************ 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 true true true 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=4 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local background_io=true 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local verify=true 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 
00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev3 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # echo BaseBdev4 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # local strip_size 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@582 -- # local create_arg 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local data_offset 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@602 -- # raid_pid=1777444 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # waitforlisten 1777444 /var/tmp/spdk-raid.sock 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@827 -- # '[' -z 1777444 ']' 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:27.087 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:22:27.087 11:58:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:27.087 [2024-05-14 11:58:54.113615] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:22:27.087 [2024-05-14 11:58:54.113680] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1777444 ] 00:22:27.087 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:27.087 Zero copy mechanism will not be used. 
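The `waitforlisten` and `waitfornbd` helpers traced above all follow the same bounded-retry shape: poll for a resource up to 20 times (`(( i <= 20 ))` in the trace), break on success, give up after the limit. A minimal generic sketch of that pattern, assuming a simple path-existence check and an illustrative 0.1 s delay (the function name and delay are not SPDK's actual helper):

```shell
# Bounded-retry wait, mirroring the "(( i = 1 )) ... (( i <= 20 ))" loops in
# the trace: poll until a path appears, return 0 on success, 1 on timeout.
wait_for_path() {
	local path=$1 i
	for ((i = 1; i <= 20; i++)); do
		if [ -e "$path" ]; then
			return 0  # resource showed up; mirrors the trace's "break"
		fi
		sleep 0.1  # illustrative back-off; SPDK's helpers pick their own delay
	done
	return 1  # exhausted retries
}
```

The real helpers substitute the check that fits the resource, e.g. `grep -q -w nbd0 /proc/partitions` for an NBD device or a test that the `/var/tmp/spdk-raid.sock` RPC socket accepts connections.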
00:22:27.346 [2024-05-14 11:58:54.231656] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:27.346 [2024-05-14 11:58:54.334235] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:27.346 [2024-05-14 11:58:54.394586] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:27.346 [2024-05-14 11:58:54.394625] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:28.282 11:58:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:22:28.282 11:58:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # return 0 00:22:28.282 11:58:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:28.282 11:58:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:28.282 BaseBdev1_malloc 00:22:28.282 11:58:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:28.541 [2024-05-14 11:58:55.506972] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:28.541 [2024-05-14 11:58:55.507019] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:28.541 [2024-05-14 11:58:55.507041] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22da960 00:22:28.541 [2024-05-14 11:58:55.507054] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:28.541 [2024-05-14 11:58:55.508929] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:28.541 [2024-05-14 11:58:55.508959] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:28.541 
BaseBdev1 00:22:28.541 11:58:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:28.541 11:58:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:28.801 BaseBdev2_malloc 00:22:28.801 11:58:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:29.060 [2024-05-14 11:58:55.990088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:29.060 [2024-05-14 11:58:55.990130] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:29.060 [2024-05-14 11:58:55.990152] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x248db40 00:22:29.060 [2024-05-14 11:58:55.990171] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:29.060 [2024-05-14 11:58:55.991686] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:29.060 [2024-05-14 11:58:55.991715] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:29.060 BaseBdev2 00:22:29.060 11:58:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:29.060 11:58:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:29.319 BaseBdev3_malloc 00:22:29.319 11:58:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:29.580 [2024-05-14 
11:58:56.467965] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:29.580 [2024-05-14 11:58:56.468008] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:29.580 [2024-05-14 11:58:56.468030] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d44c0 00:22:29.580 [2024-05-14 11:58:56.468043] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:29.580 [2024-05-14 11:58:56.469571] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:29.580 [2024-05-14 11:58:56.469599] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:29.580 BaseBdev3 00:22:29.580 11:58:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:22:29.580 11:58:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:29.879 BaseBdev4_malloc 00:22:29.879 11:58:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:29.879 [2024-05-14 11:58:56.958087] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:29.879 [2024-05-14 11:58:56.958134] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:29.879 [2024-05-14 11:58:56.958155] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d6ec0 00:22:29.879 [2024-05-14 11:58:56.958167] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:30.136 [2024-05-14 11:58:56.959766] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:30.136 [2024-05-14 11:58:56.959793] vbdev_passthru.c: 
705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:30.136 BaseBdev4 00:22:30.136 11:58:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:30.136 spare_malloc 00:22:30.394 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:30.394 spare_delay 00:22:30.394 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:30.652 [2024-05-14 11:58:57.673810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:30.652 [2024-05-14 11:58:57.673851] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:30.652 [2024-05-14 11:58:57.673871] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d7ec0 00:22:30.652 [2024-05-14 11:58:57.673883] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:30.652 [2024-05-14 11:58:57.675447] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:30.652 [2024-05-14 11:58:57.675474] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:30.652 spare 00:22:30.652 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:30.910 [2024-05-14 11:58:57.918506] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:30.910 [2024-05-14 
11:58:57.919842] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:30.910 [2024-05-14 11:58:57.919898] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:30.910 [2024-05-14 11:58:57.919945] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:30.910 [2024-05-14 11:58:57.920139] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x22d99a0 00:22:30.910 [2024-05-14 11:58:57.920151] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:30.910 [2024-05-14 11:58:57.920349] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d6990 00:22:30.910 [2024-05-14 11:58:57.920508] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22d99a0 00:22:30.910 [2024-05-14 11:58:57.920519] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22d99a0 00:22:30.910 [2024-05-14 11:58:57.920619] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.910 11:58:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:31.168 11:58:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:31.168 "name": "raid_bdev1", 00:22:31.168 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:31.168 "strip_size_kb": 0, 00:22:31.168 "state": "online", 00:22:31.168 "raid_level": "raid1", 00:22:31.168 "superblock": true, 00:22:31.168 "num_base_bdevs": 4, 00:22:31.168 "num_base_bdevs_discovered": 4, 00:22:31.168 "num_base_bdevs_operational": 4, 00:22:31.168 "base_bdevs_list": [ 00:22:31.168 { 00:22:31.168 "name": "BaseBdev1", 00:22:31.168 "uuid": "7d337618-7e80-5156-8af6-72d741f61cb5", 00:22:31.168 "is_configured": true, 00:22:31.168 "data_offset": 2048, 00:22:31.168 "data_size": 63488 00:22:31.168 }, 00:22:31.168 { 00:22:31.168 "name": "BaseBdev2", 00:22:31.168 "uuid": "12b2d951-3bc9-5700-acd8-b560a6427c1d", 00:22:31.168 "is_configured": true, 00:22:31.168 "data_offset": 2048, 00:22:31.168 "data_size": 63488 00:22:31.168 }, 00:22:31.168 { 00:22:31.168 "name": "BaseBdev3", 00:22:31.168 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:31.168 "is_configured": true, 00:22:31.168 "data_offset": 2048, 00:22:31.168 "data_size": 63488 00:22:31.168 }, 00:22:31.168 { 00:22:31.168 "name": "BaseBdev4", 00:22:31.168 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:31.168 "is_configured": true, 00:22:31.168 "data_offset": 2048, 00:22:31.168 "data_size": 63488 00:22:31.168 } 00:22:31.168 ] 
00:22:31.168 }' 00:22:31.168 11:58:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:31.168 11:58:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:31.734 11:58:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:31.734 11:58:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:22:31.991 [2024-05-14 11:58:58.901340] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:31.991 11:58:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=63488 00:22:31.991 11:58:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.991 11:58:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:32.249 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # data_offset=2048 00:22:32.249 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@626 -- # '[' true = true ']' 00:22:32.249 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:32.249 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:32.249 [2024-05-14 11:58:59.272147] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d17a0 00:22:32.249 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:32.249 Zero copy mechanism will not be used. 
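The `verify_raid_bdev_state` calls above capture the JSON that `rpc.py bdev_raid_get_bdevs all` prints and check fields such as `state` and the number of discovered base bdevs. A self-contained sketch of that check, using a trimmed copy of the trace's own JSON as input rather than a live RPC call, and a plain `grep` count in place of the trace's `jq` pipeline:

```shell
# Illustrative state check in the spirit of verify_raid_bdev_state: the
# here-doc is a trimmed copy of the bdev_raid_get_bdevs output shown in the
# trace (after BaseBdev1 was removed), not live RPC data.
raid_bdev_info=$(cat <<'EOF'
{
  "name": "raid_bdev1",
  "state": "online",
  "raid_level": "raid1",
  "num_base_bdevs": 4,
  "num_base_bdevs_discovered": 3,
  "base_bdevs_list": [
    { "name": null, "is_configured": false },
    { "name": "BaseBdev2", "is_configured": true },
    { "name": "BaseBdev3", "is_configured": true },
    { "name": "BaseBdev4", "is_configured": true }
  ]
}
EOF
)
# Count configured base bdevs; after removing BaseBdev1 the test expects 3.
num_configured=$(grep -c '"is_configured": true' <<<"$raid_bdev_info")
[ "$num_configured" -eq 3 ]
```

The actual test script does the equivalent with `jq -r '.[] | select(.name == "raid_bdev1")'` and compares `state`, `raid_level`, and `num_base_bdevs_discovered` against the expected values passed to `verify_raid_bdev_state`.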
00:22:32.249 Running I/O for 60 seconds... 00:22:32.506 [2024-05-14 11:58:59.389039] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:32.506 [2024-05-14 11:58:59.397194] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x22d17a0 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.506 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.764 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:32.764 "name": "raid_bdev1", 00:22:32.764 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:32.764 "strip_size_kb": 0, 00:22:32.764 "state": 
"online", 00:22:32.764 "raid_level": "raid1", 00:22:32.764 "superblock": true, 00:22:32.764 "num_base_bdevs": 4, 00:22:32.764 "num_base_bdevs_discovered": 3, 00:22:32.764 "num_base_bdevs_operational": 3, 00:22:32.764 "base_bdevs_list": [ 00:22:32.764 { 00:22:32.764 "name": null, 00:22:32.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.764 "is_configured": false, 00:22:32.764 "data_offset": 2048, 00:22:32.764 "data_size": 63488 00:22:32.764 }, 00:22:32.764 { 00:22:32.764 "name": "BaseBdev2", 00:22:32.764 "uuid": "12b2d951-3bc9-5700-acd8-b560a6427c1d", 00:22:32.764 "is_configured": true, 00:22:32.764 "data_offset": 2048, 00:22:32.764 "data_size": 63488 00:22:32.764 }, 00:22:32.764 { 00:22:32.764 "name": "BaseBdev3", 00:22:32.764 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:32.764 "is_configured": true, 00:22:32.764 "data_offset": 2048, 00:22:32.764 "data_size": 63488 00:22:32.764 }, 00:22:32.764 { 00:22:32.764 "name": "BaseBdev4", 00:22:32.764 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:32.764 "is_configured": true, 00:22:32.764 "data_offset": 2048, 00:22:32.764 "data_size": 63488 00:22:32.764 } 00:22:32.764 ] 00:22:32.764 }' 00:22:32.764 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:32.764 11:58:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:33.328 11:59:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:33.586 [2024-05-14 11:59:00.531383] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:33.586 11:59:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # sleep 1 00:22:33.586 [2024-05-14 11:59:00.586878] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24834e0 00:22:33.586 [2024-05-14 11:59:00.589256] 
bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:33.843 [2024-05-14 11:59:00.691022] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:33.843 [2024-05-14 11:59:00.691335] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:33.843 [2024-05-14 11:59:00.914878] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:33.843 [2024-05-14 11:59:00.915555] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:34.775 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:34.775 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:34.775 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:34.775 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:34.775 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:34.775 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.775 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:34.775 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:34.776 "name": "raid_bdev1", 00:22:34.776 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:34.776 "strip_size_kb": 0, 00:22:34.776 "state": "online", 00:22:34.776 "raid_level": "raid1", 00:22:34.776 "superblock": true, 00:22:34.776 
"num_base_bdevs": 4, 00:22:34.776 "num_base_bdevs_discovered": 4, 00:22:34.776 "num_base_bdevs_operational": 4, 00:22:34.776 "process": { 00:22:34.776 "type": "rebuild", 00:22:34.776 "target": "spare", 00:22:34.776 "progress": { 00:22:34.776 "blocks": 14336, 00:22:34.776 "percent": 22 00:22:34.776 } 00:22:34.776 }, 00:22:34.776 "base_bdevs_list": [ 00:22:34.776 { 00:22:34.776 "name": "spare", 00:22:34.776 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:34.776 "is_configured": true, 00:22:34.776 "data_offset": 2048, 00:22:34.776 "data_size": 63488 00:22:34.776 }, 00:22:34.776 { 00:22:34.776 "name": "BaseBdev2", 00:22:34.776 "uuid": "12b2d951-3bc9-5700-acd8-b560a6427c1d", 00:22:34.776 "is_configured": true, 00:22:34.776 "data_offset": 2048, 00:22:34.776 "data_size": 63488 00:22:34.776 }, 00:22:34.776 { 00:22:34.776 "name": "BaseBdev3", 00:22:34.776 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:34.776 "is_configured": true, 00:22:34.776 "data_offset": 2048, 00:22:34.776 "data_size": 63488 00:22:34.776 }, 00:22:34.776 { 00:22:34.776 "name": "BaseBdev4", 00:22:34.776 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:34.776 "is_configured": true, 00:22:34.776 "data_offset": 2048, 00:22:34.776 "data_size": 63488 00:22:34.776 } 00:22:34.776 ] 00:22:34.776 }' 00:22:34.776 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:35.034 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:35.034 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:35.034 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:35.034 11:59:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:35.292 [2024-05-14 
11:59:02.145747] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:35.292 [2024-05-14 11:59:02.175046] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:22:35.292 [2024-05-14 11:59:02.277495] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:35.292 [2024-05-14 11:59:02.290209] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:35.292 [2024-05-14 11:59:02.312936] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x22d17a0 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:35.292 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.292 11:59:02 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.549 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:35.549 "name": "raid_bdev1", 00:22:35.549 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:35.549 "strip_size_kb": 0, 00:22:35.549 "state": "online", 00:22:35.549 "raid_level": "raid1", 00:22:35.549 "superblock": true, 00:22:35.549 "num_base_bdevs": 4, 00:22:35.549 "num_base_bdevs_discovered": 3, 00:22:35.549 "num_base_bdevs_operational": 3, 00:22:35.549 "base_bdevs_list": [ 00:22:35.549 { 00:22:35.549 "name": null, 00:22:35.549 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.549 "is_configured": false, 00:22:35.549 "data_offset": 2048, 00:22:35.549 "data_size": 63488 00:22:35.549 }, 00:22:35.549 { 00:22:35.549 "name": "BaseBdev2", 00:22:35.549 "uuid": "12b2d951-3bc9-5700-acd8-b560a6427c1d", 00:22:35.549 "is_configured": true, 00:22:35.549 "data_offset": 2048, 00:22:35.549 "data_size": 63488 00:22:35.549 }, 00:22:35.549 { 00:22:35.549 "name": "BaseBdev3", 00:22:35.549 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:35.549 "is_configured": true, 00:22:35.549 "data_offset": 2048, 00:22:35.549 "data_size": 63488 00:22:35.549 }, 00:22:35.549 { 00:22:35.549 "name": "BaseBdev4", 00:22:35.549 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:35.549 "is_configured": true, 00:22:35.549 "data_offset": 2048, 00:22:35.549 "data_size": 63488 00:22:35.549 } 00:22:35.549 ] 00:22:35.549 }' 00:22:35.549 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:35.549 11:59:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:36.483 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:36.483 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 
00:22:36.483 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:36.483 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:36.483 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:36.483 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.483 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.483 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:36.483 "name": "raid_bdev1", 00:22:36.483 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:36.483 "strip_size_kb": 0, 00:22:36.483 "state": "online", 00:22:36.483 "raid_level": "raid1", 00:22:36.483 "superblock": true, 00:22:36.483 "num_base_bdevs": 4, 00:22:36.483 "num_base_bdevs_discovered": 3, 00:22:36.483 "num_base_bdevs_operational": 3, 00:22:36.483 "base_bdevs_list": [ 00:22:36.483 { 00:22:36.483 "name": null, 00:22:36.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.483 "is_configured": false, 00:22:36.483 "data_offset": 2048, 00:22:36.484 "data_size": 63488 00:22:36.484 }, 00:22:36.484 { 00:22:36.484 "name": "BaseBdev2", 00:22:36.484 "uuid": "12b2d951-3bc9-5700-acd8-b560a6427c1d", 00:22:36.484 "is_configured": true, 00:22:36.484 "data_offset": 2048, 00:22:36.484 "data_size": 63488 00:22:36.484 }, 00:22:36.484 { 00:22:36.484 "name": "BaseBdev3", 00:22:36.484 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:36.484 "is_configured": true, 00:22:36.484 "data_offset": 2048, 00:22:36.484 "data_size": 63488 00:22:36.484 }, 00:22:36.484 { 00:22:36.484 "name": "BaseBdev4", 00:22:36.484 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:36.484 "is_configured": true, 00:22:36.484 "data_offset": 
2048, 00:22:36.484 "data_size": 63488 00:22:36.484 } 00:22:36.484 ] 00:22:36.484 }' 00:22:36.484 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:36.484 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:36.484 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:36.741 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:36.741 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:36.741 [2024-05-14 11:59:03.822343] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:36.999 [2024-05-14 11:59:03.859298] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d1980 00:22:36.999 [2024-05-14 11:59:03.860822] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:36.999 11:59:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # sleep 1 00:22:36.999 [2024-05-14 11:59:03.972961] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:36.999 [2024-05-14 11:59:03.974249] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:37.257 [2024-05-14 11:59:04.184826] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:37.257 [2024-05-14 11:59:04.184999] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:37.515 [2024-05-14 11:59:04.569377] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 
offset_begin: 6144 offset_end: 12288 00:22:37.515 [2024-05-14 11:59:04.570736] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:37.773 [2024-05-14 11:59:04.773810] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:37.773 [2024-05-14 11:59:04.773991] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:38.031 11:59:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:38.031 11:59:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:38.031 11:59:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:38.031 11:59:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:38.031 11:59:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:38.032 11:59:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.032 11:59:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:38.032 [2024-05-14 11:59:05.047584] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:38.290 "name": "raid_bdev1", 00:22:38.290 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:38.290 "strip_size_kb": 0, 00:22:38.290 "state": "online", 00:22:38.290 "raid_level": "raid1", 00:22:38.290 "superblock": true, 00:22:38.290 "num_base_bdevs": 4, 00:22:38.290 
"num_base_bdevs_discovered": 4, 00:22:38.290 "num_base_bdevs_operational": 4, 00:22:38.290 "process": { 00:22:38.290 "type": "rebuild", 00:22:38.290 "target": "spare", 00:22:38.290 "progress": { 00:22:38.290 "blocks": 14336, 00:22:38.290 "percent": 22 00:22:38.290 } 00:22:38.290 }, 00:22:38.290 "base_bdevs_list": [ 00:22:38.290 { 00:22:38.290 "name": "spare", 00:22:38.290 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:38.290 "is_configured": true, 00:22:38.290 "data_offset": 2048, 00:22:38.290 "data_size": 63488 00:22:38.290 }, 00:22:38.290 { 00:22:38.290 "name": "BaseBdev2", 00:22:38.290 "uuid": "12b2d951-3bc9-5700-acd8-b560a6427c1d", 00:22:38.290 "is_configured": true, 00:22:38.290 "data_offset": 2048, 00:22:38.290 "data_size": 63488 00:22:38.290 }, 00:22:38.290 { 00:22:38.290 "name": "BaseBdev3", 00:22:38.290 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:38.290 "is_configured": true, 00:22:38.290 "data_offset": 2048, 00:22:38.290 "data_size": 63488 00:22:38.290 }, 00:22:38.290 { 00:22:38.290 "name": "BaseBdev4", 00:22:38.290 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:38.290 "is_configured": true, 00:22:38.290 "data_offset": 2048, 00:22:38.290 "data_size": 63488 00:22:38.290 } 00:22:38.290 ] 00:22:38.290 }' 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:38.290 [2024-05-14 11:59:05.159239] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 
00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:22:38.290 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=4 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # '[' 4 -gt 2 ']' 00:22:38.290 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@700 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:38.549 [2024-05-14 11:59:05.449582] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:38.549 [2024-05-14 11:59:05.533795] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x22d17a0 00:22:38.549 [2024-05-14 11:59:05.533820] bdev_raid.c:1957:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x22d1980 00:22:38.807 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@703 -- # base_bdevs[1]= 00:22:38.807 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@704 -- # (( num_base_bdevs_operational-- )) 00:22:38.807 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:38.807 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:38.807 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:38.807 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:38.807 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:38.807 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.807 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:38.807 [2024-05-14 11:59:05.887000] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:22:39.066 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:39.066 "name": "raid_bdev1", 00:22:39.066 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:39.066 "strip_size_kb": 0, 00:22:39.066 "state": "online", 00:22:39.066 "raid_level": "raid1", 00:22:39.066 "superblock": true, 00:22:39.066 "num_base_bdevs": 4, 00:22:39.066 "num_base_bdevs_discovered": 3, 00:22:39.066 "num_base_bdevs_operational": 3, 00:22:39.066 "process": { 00:22:39.066 "type": "rebuild", 00:22:39.066 "target": "spare", 00:22:39.066 "progress": { 00:22:39.066 "blocks": 28672, 00:22:39.066 "percent": 45 00:22:39.066 } 00:22:39.066 }, 00:22:39.066 "base_bdevs_list": [ 00:22:39.066 { 00:22:39.066 "name": "spare", 00:22:39.066 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:39.066 "is_configured": true, 00:22:39.066 "data_offset": 2048, 00:22:39.066 "data_size": 63488 00:22:39.066 }, 00:22:39.066 { 00:22:39.066 "name": null, 00:22:39.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.066 "is_configured": false, 00:22:39.066 "data_offset": 2048, 00:22:39.066 "data_size": 63488 00:22:39.066 }, 00:22:39.066 { 00:22:39.066 "name": "BaseBdev3", 00:22:39.066 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:39.066 "is_configured": true, 00:22:39.066 "data_offset": 2048, 00:22:39.066 "data_size": 63488 00:22:39.066 }, 00:22:39.066 { 00:22:39.066 "name": "BaseBdev4", 00:22:39.066 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:39.066 "is_configured": true, 00:22:39.066 "data_offset": 2048, 00:22:39.066 
"data_size": 63488 00:22:39.066 } 00:22:39.066 ] 00:22:39.066 }' 00:22:39.066 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:39.066 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:39.066 11:59:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@711 -- # local timeout=802 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.066 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.324 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:39.324 "name": "raid_bdev1", 00:22:39.324 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:39.324 "strip_size_kb": 0, 00:22:39.324 "state": "online", 00:22:39.324 "raid_level": "raid1", 00:22:39.324 "superblock": true, 
00:22:39.324 "num_base_bdevs": 4, 00:22:39.324 "num_base_bdevs_discovered": 3, 00:22:39.324 "num_base_bdevs_operational": 3, 00:22:39.324 "process": { 00:22:39.324 "type": "rebuild", 00:22:39.324 "target": "spare", 00:22:39.324 "progress": { 00:22:39.324 "blocks": 30720, 00:22:39.324 "percent": 48 00:22:39.324 } 00:22:39.324 }, 00:22:39.324 "base_bdevs_list": [ 00:22:39.324 { 00:22:39.324 "name": "spare", 00:22:39.324 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:39.324 "is_configured": true, 00:22:39.324 "data_offset": 2048, 00:22:39.324 "data_size": 63488 00:22:39.324 }, 00:22:39.324 { 00:22:39.324 "name": null, 00:22:39.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.324 "is_configured": false, 00:22:39.324 "data_offset": 2048, 00:22:39.324 "data_size": 63488 00:22:39.324 }, 00:22:39.324 { 00:22:39.324 "name": "BaseBdev3", 00:22:39.324 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:39.324 "is_configured": true, 00:22:39.324 "data_offset": 2048, 00:22:39.324 "data_size": 63488 00:22:39.324 }, 00:22:39.324 { 00:22:39.324 "name": "BaseBdev4", 00:22:39.324 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:39.324 "is_configured": true, 00:22:39.324 "data_offset": 2048, 00:22:39.324 "data_size": 63488 00:22:39.324 } 00:22:39.324 ] 00:22:39.324 }' 00:22:39.324 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:39.324 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:39.324 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:39.324 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:39.324 11:59:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:22:39.891 [2024-05-14 11:59:06.676016] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 
36864 offset_end: 43008 00:22:39.891 [2024-05-14 11:59:06.676203] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:22:40.458 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:40.458 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:40.458 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:40.458 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:40.458 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:40.458 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:40.458 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.458 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.716 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:40.716 "name": "raid_bdev1", 00:22:40.716 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:40.716 "strip_size_kb": 0, 00:22:40.716 "state": "online", 00:22:40.716 "raid_level": "raid1", 00:22:40.716 "superblock": true, 00:22:40.716 "num_base_bdevs": 4, 00:22:40.716 "num_base_bdevs_discovered": 3, 00:22:40.716 "num_base_bdevs_operational": 3, 00:22:40.716 "process": { 00:22:40.716 "type": "rebuild", 00:22:40.716 "target": "spare", 00:22:40.716 "progress": { 00:22:40.716 "blocks": 55296, 00:22:40.716 "percent": 87 00:22:40.716 } 00:22:40.716 }, 00:22:40.716 "base_bdevs_list": [ 00:22:40.716 { 00:22:40.716 "name": "spare", 00:22:40.716 "uuid": 
"152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:40.716 "is_configured": true, 00:22:40.716 "data_offset": 2048, 00:22:40.716 "data_size": 63488 00:22:40.716 }, 00:22:40.716 { 00:22:40.716 "name": null, 00:22:40.716 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.716 "is_configured": false, 00:22:40.716 "data_offset": 2048, 00:22:40.716 "data_size": 63488 00:22:40.716 }, 00:22:40.716 { 00:22:40.716 "name": "BaseBdev3", 00:22:40.716 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:40.716 "is_configured": true, 00:22:40.716 "data_offset": 2048, 00:22:40.716 "data_size": 63488 00:22:40.716 }, 00:22:40.716 { 00:22:40.716 "name": "BaseBdev4", 00:22:40.717 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:40.717 "is_configured": true, 00:22:40.717 "data_offset": 2048, 00:22:40.717 "data_size": 63488 00:22:40.717 } 00:22:40.717 ] 00:22:40.717 }' 00:22:40.717 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:40.717 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:40.717 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:40.717 [2024-05-14 11:59:07.691135] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:22:40.717 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:40.717 11:59:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # sleep 1 00:22:41.283 [2024-05-14 11:59:08.143125] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:41.283 [2024-05-14 11:59:08.243412] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:41.283 [2024-05-14 11:59:08.245385] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:41.850 11:59:08 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:22:41.850 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:41.850 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:41.850 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:41.850 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:41.850 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:41.850 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.850 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.109 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:42.109 "name": "raid_bdev1", 00:22:42.109 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:42.109 "strip_size_kb": 0, 00:22:42.109 "state": "online", 00:22:42.109 "raid_level": "raid1", 00:22:42.109 "superblock": true, 00:22:42.109 "num_base_bdevs": 4, 00:22:42.109 "num_base_bdevs_discovered": 3, 00:22:42.109 "num_base_bdevs_operational": 3, 00:22:42.109 "base_bdevs_list": [ 00:22:42.109 { 00:22:42.109 "name": "spare", 00:22:42.109 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:42.109 "is_configured": true, 00:22:42.109 "data_offset": 2048, 00:22:42.109 "data_size": 63488 00:22:42.109 }, 00:22:42.109 { 00:22:42.109 "name": null, 00:22:42.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.109 "is_configured": false, 00:22:42.109 "data_offset": 2048, 00:22:42.109 "data_size": 63488 00:22:42.109 }, 00:22:42.109 { 00:22:42.109 "name": "BaseBdev3", 00:22:42.109 
"uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:42.109 "is_configured": true, 00:22:42.109 "data_offset": 2048, 00:22:42.109 "data_size": 63488 00:22:42.109 }, 00:22:42.109 { 00:22:42.109 "name": "BaseBdev4", 00:22:42.109 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:42.109 "is_configured": true, 00:22:42.109 "data_offset": 2048, 00:22:42.109 "data_size": 63488 00:22:42.109 } 00:22:42.109 ] 00:22:42.109 }' 00:22:42.109 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:42.109 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:42.109 11:59:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:42.109 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:22:42.109 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # break 00:22:42.109 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:42.109 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:42.109 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:42.109 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:42.109 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:42.109 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.109 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:42.368 
"name": "raid_bdev1", 00:22:42.368 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:42.368 "strip_size_kb": 0, 00:22:42.368 "state": "online", 00:22:42.368 "raid_level": "raid1", 00:22:42.368 "superblock": true, 00:22:42.368 "num_base_bdevs": 4, 00:22:42.368 "num_base_bdevs_discovered": 3, 00:22:42.368 "num_base_bdevs_operational": 3, 00:22:42.368 "base_bdevs_list": [ 00:22:42.368 { 00:22:42.368 "name": "spare", 00:22:42.368 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:42.368 "is_configured": true, 00:22:42.368 "data_offset": 2048, 00:22:42.368 "data_size": 63488 00:22:42.368 }, 00:22:42.368 { 00:22:42.368 "name": null, 00:22:42.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.368 "is_configured": false, 00:22:42.368 "data_offset": 2048, 00:22:42.368 "data_size": 63488 00:22:42.368 }, 00:22:42.368 { 00:22:42.368 "name": "BaseBdev3", 00:22:42.368 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:42.368 "is_configured": true, 00:22:42.368 "data_offset": 2048, 00:22:42.368 "data_size": 63488 00:22:42.368 }, 00:22:42.368 { 00:22:42.368 "name": "BaseBdev4", 00:22:42.368 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:42.368 "is_configured": true, 00:22:42.368 "data_offset": 2048, 00:22:42.368 "data_size": 63488 00:22:42.368 } 00:22:42.368 ] 00:22:42.368 }' 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=raid_bdev1 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:42.368 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:42.369 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:42.369 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.369 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.627 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:42.627 "name": "raid_bdev1", 00:22:42.627 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:42.627 "strip_size_kb": 0, 00:22:42.627 "state": "online", 00:22:42.627 "raid_level": "raid1", 00:22:42.627 "superblock": true, 00:22:42.627 "num_base_bdevs": 4, 00:22:42.627 "num_base_bdevs_discovered": 3, 00:22:42.627 "num_base_bdevs_operational": 3, 00:22:42.627 "base_bdevs_list": [ 00:22:42.627 { 00:22:42.627 "name": "spare", 00:22:42.627 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:42.627 "is_configured": true, 00:22:42.627 "data_offset": 2048, 00:22:42.627 "data_size": 63488 00:22:42.627 }, 00:22:42.627 { 00:22:42.627 "name": null, 
00:22:42.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.627 "is_configured": false, 00:22:42.627 "data_offset": 2048, 00:22:42.627 "data_size": 63488 00:22:42.627 }, 00:22:42.627 { 00:22:42.627 "name": "BaseBdev3", 00:22:42.627 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:42.627 "is_configured": true, 00:22:42.627 "data_offset": 2048, 00:22:42.627 "data_size": 63488 00:22:42.627 }, 00:22:42.627 { 00:22:42.627 "name": "BaseBdev4", 00:22:42.627 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:42.627 "is_configured": true, 00:22:42.627 "data_offset": 2048, 00:22:42.627 "data_size": 63488 00:22:42.627 } 00:22:42.627 ] 00:22:42.627 }' 00:22:42.627 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:42.627 11:59:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:43.194 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:43.453 [2024-05-14 11:59:10.417249] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:43.453 [2024-05-14 11:59:10.417278] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:43.453 00:22:43.453 Latency(us) 00:22:43.453 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:43.453 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:22:43.453 raid_bdev1 : 11.16 88.66 265.97 0.00 0.00 15984.95 295.62 119446.48 00:22:43.453 =================================================================================================================== 00:22:43.453 Total : 88.66 265.97 0.00 0.00 15984.95 295.62 119446.48 00:22:43.453 [2024-05-14 11:59:10.461286] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:43.453 [2024-05-14 11:59:10.461314] bdev_raid.c: 
448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:43.453 [2024-05-14 11:59:10.461421] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:43.453 [2024-05-14 11:59:10.461434] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d99a0 name raid_bdev1, state offline 00:22:43.453 0 00:22:43.453 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.453 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # jq length 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@728 -- # '[' true = true ']' 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:43.711 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:43.711 11:59:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:22:43.970 /dev/nbd0 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:43.970 11:59:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:43.970 1+0 records in 00:22:43.970 1+0 records out 00:22:43.970 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257005 s, 15.9 MB/s 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # '[' -z '' ']' 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # continue 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev3 ']' 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 
1 )) 00:22:43.970 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:22:44.229 /dev/nbd1 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:44.229 1+0 records in 00:22:44.229 1+0 records out 00:22:44.229 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262939 s, 15.6 MB/s 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:44.229 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@736 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:44.487 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:44.487 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:44.487 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:44.487 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:44.487 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:22:44.487 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:44.487 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # for bdev in "${base_bdevs[@]:1}" 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # '[' -z BaseBdev4 ']' 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:44.746 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:22:45.004 /dev/nbd1 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 
00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:45.004 1+0 records in 00:22:45.004 1+0 records out 00:22:45.004 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288781 s, 14.2 MB/s 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:45.004 11:59:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@736 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:45.004 11:59:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd0 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:45.262 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:22:45.519 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:45.776 11:59:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:46.034 [2024-05-14 11:59:12.995033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:46.034 [2024-05-14 11:59:12.995089] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:46.034 [2024-05-14 11:59:12.995109] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22db620 00:22:46.034 [2024-05-14 11:59:12.995122] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:46.034 [2024-05-14 11:59:12.996723] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:46.034 [2024-05-14 11:59:12.996753] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:46.034 [2024-05-14 11:59:12.996820] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:46.034 [2024-05-14 11:59:12.996847] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:46.034 BaseBdev1 00:22:46.034 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:22:46.034 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z '' ']' 00:22:46.034 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # continue 00:22:46.034 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:22:46.034 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev3 ']' 00:22:46.034 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev3 00:22:46.330 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:46.613 [2024-05-14 11:59:13.484382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:46.613 [2024-05-14 11:59:13.484431] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:46.613 [2024-05-14 11:59:13.484450] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22ee640 00:22:46.613 [2024-05-14 11:59:13.484462] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:46.613 [2024-05-14 11:59:13.484783] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:46.613 [2024-05-14 11:59:13.484801] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:46.613 [2024-05-14 11:59:13.484860] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev3 00:22:46.613 [2024-05-14 11:59:13.484872] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev3 (4) greater than existing raid bdev raid_bdev1 (1) 00:22:46.613 [2024-05-14 11:59:13.484882] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:46.613 [2024-05-14 11:59:13.484898] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d1870 name raid_bdev1, state configuring 00:22:46.613 [2024-05-14 11:59:13.484927] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:46.613 BaseBdev3 00:22:46.613 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:22:46.613 11:59:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev4 ']' 00:22:46.613 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev4 00:22:46.872 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:47.131 [2024-05-14 11:59:13.973808] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:47.131 [2024-05-14 11:59:13.973846] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.131 [2024-05-14 11:59:13.973865] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d2f70 00:22:47.131 [2024-05-14 11:59:13.973877] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.131 [2024-05-14 11:59:13.974171] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.131 [2024-05-14 11:59:13.974194] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:47.131 [2024-05-14 11:59:13.974248] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev4 00:22:47.131 [2024-05-14 11:59:13.974267] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:47.131 BaseBdev4 00:22:47.131 11:59:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:47.390 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:47.390 [2024-05-14 
11:59:14.459141] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:47.390 [2024-05-14 11:59:14.459178] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.390 [2024-05-14 11:59:14.459196] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22db320 00:22:47.390 [2024-05-14 11:59:14.459209] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.390 [2024-05-14 11:59:14.459542] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.390 [2024-05-14 11:59:14.459560] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:47.390 [2024-05-14 11:59:14.459627] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:22:47.390 [2024-05-14 11:59:14.459646] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:47.390 spare 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local 
num_base_bdevs_discovered 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.650 [2024-05-14 11:59:14.559977] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x2478190 00:22:47.650 [2024-05-14 11:59:14.559993] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:47.650 [2024-05-14 11:59:14.560177] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22db5f0 00:22:47.650 [2024-05-14 11:59:14.560324] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2478190 00:22:47.650 [2024-05-14 11:59:14.560335] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2478190 00:22:47.650 [2024-05-14 11:59:14.560458] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:47.650 "name": "raid_bdev1", 00:22:47.650 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:47.650 "strip_size_kb": 0, 00:22:47.650 "state": "online", 00:22:47.650 "raid_level": "raid1", 00:22:47.650 "superblock": true, 00:22:47.650 "num_base_bdevs": 4, 00:22:47.650 "num_base_bdevs_discovered": 3, 00:22:47.650 "num_base_bdevs_operational": 3, 00:22:47.650 "base_bdevs_list": [ 00:22:47.650 { 00:22:47.650 "name": "spare", 00:22:47.650 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:47.650 "is_configured": true, 00:22:47.650 "data_offset": 2048, 00:22:47.650 "data_size": 63488 00:22:47.650 }, 00:22:47.650 { 00:22:47.650 "name": null, 00:22:47.650 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:22:47.650 "is_configured": false, 00:22:47.650 "data_offset": 2048, 00:22:47.650 "data_size": 63488 00:22:47.650 }, 00:22:47.650 { 00:22:47.650 "name": "BaseBdev3", 00:22:47.650 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:47.650 "is_configured": true, 00:22:47.650 "data_offset": 2048, 00:22:47.650 "data_size": 63488 00:22:47.650 }, 00:22:47.650 { 00:22:47.650 "name": "BaseBdev4", 00:22:47.650 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:47.650 "is_configured": true, 00:22:47.650 "data_offset": 2048, 00:22:47.650 "data_size": 63488 00:22:47.650 } 00:22:47.650 ] 00:22:47.650 }' 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:47.650 11:59:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:48.586 "name": "raid_bdev1", 00:22:48.586 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:48.586 "strip_size_kb": 0, 00:22:48.586 
"state": "online", 00:22:48.586 "raid_level": "raid1", 00:22:48.586 "superblock": true, 00:22:48.586 "num_base_bdevs": 4, 00:22:48.586 "num_base_bdevs_discovered": 3, 00:22:48.586 "num_base_bdevs_operational": 3, 00:22:48.586 "base_bdevs_list": [ 00:22:48.586 { 00:22:48.586 "name": "spare", 00:22:48.586 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:48.586 "is_configured": true, 00:22:48.586 "data_offset": 2048, 00:22:48.586 "data_size": 63488 00:22:48.586 }, 00:22:48.586 { 00:22:48.586 "name": null, 00:22:48.586 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:48.586 "is_configured": false, 00:22:48.586 "data_offset": 2048, 00:22:48.586 "data_size": 63488 00:22:48.586 }, 00:22:48.586 { 00:22:48.586 "name": "BaseBdev3", 00:22:48.586 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:48.586 "is_configured": true, 00:22:48.586 "data_offset": 2048, 00:22:48.586 "data_size": 63488 00:22:48.586 }, 00:22:48.586 { 00:22:48.586 "name": "BaseBdev4", 00:22:48.586 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:48.586 "is_configured": true, 00:22:48.586 "data_offset": 2048, 00:22:48.586 "data_size": 63488 00:22:48.586 } 00:22:48.586 ] 00:22:48.586 }' 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:48.586 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:48.846 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.846 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:48.846 11:59:15 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:22:48.846 11:59:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:49.104 [2024-05-14 11:59:16.127936] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.104 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.362 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:49.362 "name": 
"raid_bdev1", 00:22:49.362 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:49.362 "strip_size_kb": 0, 00:22:49.362 "state": "online", 00:22:49.362 "raid_level": "raid1", 00:22:49.362 "superblock": true, 00:22:49.362 "num_base_bdevs": 4, 00:22:49.362 "num_base_bdevs_discovered": 2, 00:22:49.362 "num_base_bdevs_operational": 2, 00:22:49.362 "base_bdevs_list": [ 00:22:49.362 { 00:22:49.362 "name": null, 00:22:49.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.362 "is_configured": false, 00:22:49.362 "data_offset": 2048, 00:22:49.362 "data_size": 63488 00:22:49.362 }, 00:22:49.362 { 00:22:49.362 "name": null, 00:22:49.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.362 "is_configured": false, 00:22:49.362 "data_offset": 2048, 00:22:49.362 "data_size": 63488 00:22:49.362 }, 00:22:49.362 { 00:22:49.362 "name": "BaseBdev3", 00:22:49.362 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:49.362 "is_configured": true, 00:22:49.362 "data_offset": 2048, 00:22:49.362 "data_size": 63488 00:22:49.362 }, 00:22:49.362 { 00:22:49.362 "name": "BaseBdev4", 00:22:49.362 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:49.362 "is_configured": true, 00:22:49.362 "data_offset": 2048, 00:22:49.362 "data_size": 63488 00:22:49.362 } 00:22:49.362 ] 00:22:49.362 }' 00:22:49.362 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:49.362 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:49.928 11:59:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:50.187 [2024-05-14 11:59:17.211232] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:50.187 [2024-05-14 11:59:17.211381] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid 
bdev raid_bdev1 (6) 00:22:50.187 [2024-05-14 11:59:17.211405] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:50.187 [2024-05-14 11:59:17.211434] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:50.187 [2024-05-14 11:59:17.215870] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22efe70 00:22:50.187 [2024-05-14 11:59:17.217345] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:50.187 11:59:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # sleep 1 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:51.563 "name": "raid_bdev1", 00:22:51.563 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:51.563 "strip_size_kb": 0, 00:22:51.563 "state": "online", 00:22:51.563 "raid_level": "raid1", 00:22:51.563 "superblock": true, 00:22:51.563 "num_base_bdevs": 4, 00:22:51.563 "num_base_bdevs_discovered": 3, 00:22:51.563 "num_base_bdevs_operational": 3, 
00:22:51.563 "process": { 00:22:51.563 "type": "rebuild", 00:22:51.563 "target": "spare", 00:22:51.563 "progress": { 00:22:51.563 "blocks": 24576, 00:22:51.563 "percent": 38 00:22:51.563 } 00:22:51.563 }, 00:22:51.563 "base_bdevs_list": [ 00:22:51.563 { 00:22:51.563 "name": "spare", 00:22:51.563 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:51.563 "is_configured": true, 00:22:51.563 "data_offset": 2048, 00:22:51.563 "data_size": 63488 00:22:51.563 }, 00:22:51.563 { 00:22:51.563 "name": null, 00:22:51.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:51.563 "is_configured": false, 00:22:51.563 "data_offset": 2048, 00:22:51.563 "data_size": 63488 00:22:51.563 }, 00:22:51.563 { 00:22:51.563 "name": "BaseBdev3", 00:22:51.563 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:51.563 "is_configured": true, 00:22:51.563 "data_offset": 2048, 00:22:51.563 "data_size": 63488 00:22:51.563 }, 00:22:51.563 { 00:22:51.563 "name": "BaseBdev4", 00:22:51.563 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:51.563 "is_configured": true, 00:22:51.563 "data_offset": 2048, 00:22:51.563 "data_size": 63488 00:22:51.563 } 00:22:51.563 ] 00:22:51.563 }' 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:51.563 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:51.564 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:51.822 [2024-05-14 11:59:18.810042] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:51.822 [2024-05-14 11:59:18.830032] 
bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:51.822 [2024-05-14 11:59:18.830075] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.822 11:59:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.081 11:59:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:52.081 "name": "raid_bdev1", 00:22:52.081 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:52.081 "strip_size_kb": 0, 00:22:52.081 "state": "online", 00:22:52.081 "raid_level": "raid1", 
00:22:52.081 "superblock": true, 00:22:52.081 "num_base_bdevs": 4, 00:22:52.081 "num_base_bdevs_discovered": 2, 00:22:52.081 "num_base_bdevs_operational": 2, 00:22:52.081 "base_bdevs_list": [ 00:22:52.081 { 00:22:52.081 "name": null, 00:22:52.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.081 "is_configured": false, 00:22:52.081 "data_offset": 2048, 00:22:52.081 "data_size": 63488 00:22:52.081 }, 00:22:52.081 { 00:22:52.081 "name": null, 00:22:52.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.081 "is_configured": false, 00:22:52.081 "data_offset": 2048, 00:22:52.081 "data_size": 63488 00:22:52.081 }, 00:22:52.081 { 00:22:52.081 "name": "BaseBdev3", 00:22:52.081 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:52.081 "is_configured": true, 00:22:52.081 "data_offset": 2048, 00:22:52.081 "data_size": 63488 00:22:52.081 }, 00:22:52.081 { 00:22:52.081 "name": "BaseBdev4", 00:22:52.081 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:52.081 "is_configured": true, 00:22:52.081 "data_offset": 2048, 00:22:52.081 "data_size": 63488 00:22:52.081 } 00:22:52.081 ] 00:22:52.081 }' 00:22:52.081 11:59:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:52.081 11:59:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:52.648 11:59:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:52.906 [2024-05-14 11:59:19.808983] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:52.906 [2024-05-14 11:59:19.809030] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.906 [2024-05-14 11:59:19.809050] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22ef3e0 00:22:52.906 [2024-05-14 11:59:19.809063] vbdev_passthru.c: 
691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.906 [2024-05-14 11:59:19.809434] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.906 [2024-05-14 11:59:19.809452] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:52.906 [2024-05-14 11:59:19.809531] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:22:52.906 [2024-05-14 11:59:19.809543] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:22:52.906 [2024-05-14 11:59:19.809554] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:52.906 [2024-05-14 11:59:19.809573] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:52.906 [2024-05-14 11:59:19.813987] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fe2090 00:22:52.906 spare 00:22:52.906 [2024-05-14 11:59:19.815341] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:52.906 11:59:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # sleep 1 00:22:53.841 11:59:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:53.841 11:59:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:53.841 11:59:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:53.841 11:59:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:53.841 11:59:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:53.841 11:59:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:53.841 11:59:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.099 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:54.099 "name": "raid_bdev1", 00:22:54.099 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:54.099 "strip_size_kb": 0, 00:22:54.099 "state": "online", 00:22:54.099 "raid_level": "raid1", 00:22:54.099 "superblock": true, 00:22:54.099 "num_base_bdevs": 4, 00:22:54.099 "num_base_bdevs_discovered": 3, 00:22:54.099 "num_base_bdevs_operational": 3, 00:22:54.099 "process": { 00:22:54.099 "type": "rebuild", 00:22:54.099 "target": "spare", 00:22:54.099 "progress": { 00:22:54.099 "blocks": 24576, 00:22:54.099 "percent": 38 00:22:54.099 } 00:22:54.099 }, 00:22:54.099 "base_bdevs_list": [ 00:22:54.099 { 00:22:54.099 "name": "spare", 00:22:54.099 "uuid": "152e65c8-c2db-5987-8e6f-bacfc779351e", 00:22:54.099 "is_configured": true, 00:22:54.099 "data_offset": 2048, 00:22:54.099 "data_size": 63488 00:22:54.099 }, 00:22:54.099 { 00:22:54.099 "name": null, 00:22:54.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:54.099 "is_configured": false, 00:22:54.099 "data_offset": 2048, 00:22:54.099 "data_size": 63488 00:22:54.099 }, 00:22:54.099 { 00:22:54.099 "name": "BaseBdev3", 00:22:54.099 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:54.099 "is_configured": true, 00:22:54.099 "data_offset": 2048, 00:22:54.099 "data_size": 63488 00:22:54.099 }, 00:22:54.099 { 00:22:54.099 "name": "BaseBdev4", 00:22:54.099 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:54.099 "is_configured": true, 00:22:54.099 "data_offset": 2048, 00:22:54.099 "data_size": 63488 00:22:54.099 } 00:22:54.099 ] 00:22:54.099 }' 00:22:54.099 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:54.099 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild 
== \r\e\b\u\i\l\d ]] 00:22:54.099 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:54.099 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:54.099 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:54.358 [2024-05-14 11:59:21.395780] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:54.358 [2024-05-14 11:59:21.427772] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:54.358 [2024-05-14 11:59:21.427830] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:54.617 11:59:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.617 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.875 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:54.875 "name": "raid_bdev1", 00:22:54.875 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:54.875 "strip_size_kb": 0, 00:22:54.875 "state": "online", 00:22:54.875 "raid_level": "raid1", 00:22:54.875 "superblock": true, 00:22:54.875 "num_base_bdevs": 4, 00:22:54.875 "num_base_bdevs_discovered": 2, 00:22:54.875 "num_base_bdevs_operational": 2, 00:22:54.875 "base_bdevs_list": [ 00:22:54.875 { 00:22:54.875 "name": null, 00:22:54.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:54.875 "is_configured": false, 00:22:54.875 "data_offset": 2048, 00:22:54.875 "data_size": 63488 00:22:54.875 }, 00:22:54.875 { 00:22:54.875 "name": null, 00:22:54.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:54.875 "is_configured": false, 00:22:54.875 "data_offset": 2048, 00:22:54.875 "data_size": 63488 00:22:54.875 }, 00:22:54.875 { 00:22:54.875 "name": "BaseBdev3", 00:22:54.875 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:54.875 "is_configured": true, 00:22:54.875 "data_offset": 2048, 00:22:54.875 "data_size": 63488 00:22:54.875 }, 00:22:54.875 { 00:22:54.875 "name": "BaseBdev4", 00:22:54.875 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:54.875 "is_configured": true, 00:22:54.875 "data_offset": 2048, 00:22:54.875 "data_size": 63488 00:22:54.875 } 00:22:54.875 ] 00:22:54.875 }' 00:22:54.875 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:54.875 11:59:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:55.440 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:55.440 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:55.440 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:55.440 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:55.440 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:55.440 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.440 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.440 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:55.440 "name": "raid_bdev1", 00:22:55.440 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:55.440 "strip_size_kb": 0, 00:22:55.440 "state": "online", 00:22:55.440 "raid_level": "raid1", 00:22:55.440 "superblock": true, 00:22:55.440 "num_base_bdevs": 4, 00:22:55.440 "num_base_bdevs_discovered": 2, 00:22:55.440 "num_base_bdevs_operational": 2, 00:22:55.441 "base_bdevs_list": [ 00:22:55.441 { 00:22:55.441 "name": null, 00:22:55.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:55.441 "is_configured": false, 00:22:55.441 "data_offset": 2048, 00:22:55.441 "data_size": 63488 00:22:55.441 }, 00:22:55.441 { 00:22:55.441 "name": null, 00:22:55.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:55.441 "is_configured": false, 00:22:55.441 "data_offset": 2048, 00:22:55.441 "data_size": 63488 00:22:55.441 }, 00:22:55.441 { 00:22:55.441 "name": "BaseBdev3", 00:22:55.441 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:55.441 "is_configured": true, 00:22:55.441 "data_offset": 2048, 00:22:55.441 "data_size": 63488 
00:22:55.441 }, 00:22:55.441 { 00:22:55.441 "name": "BaseBdev4", 00:22:55.441 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:55.441 "is_configured": true, 00:22:55.441 "data_offset": 2048, 00:22:55.441 "data_size": 63488 00:22:55.441 } 00:22:55.441 ] 00:22:55.441 }' 00:22:55.441 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:55.441 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:55.441 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:55.698 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:55.698 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:55.956 11:59:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:55.956 [2024-05-14 11:59:23.008579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:55.956 [2024-05-14 11:59:23.008627] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.956 [2024-05-14 11:59:23.008647] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d9520 00:22:55.956 [2024-05-14 11:59:23.008660] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.956 [2024-05-14 11:59:23.008990] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.956 [2024-05-14 11:59:23.009007] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:55.956 [2024-05-14 11:59:23.009070] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: 
raid superblock found on bdev BaseBdev1 00:22:55.956 [2024-05-14 11:59:23.009082] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:55.956 [2024-05-14 11:59:23.009092] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:55.956 BaseBdev1 00:22:55.956 11:59:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@786 -- # sleep 1 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.330 11:59:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:57.330 "name": "raid_bdev1", 00:22:57.330 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:57.330 "strip_size_kb": 0, 00:22:57.330 "state": "online", 00:22:57.330 "raid_level": "raid1", 00:22:57.330 "superblock": true, 00:22:57.330 "num_base_bdevs": 4, 00:22:57.330 "num_base_bdevs_discovered": 2, 00:22:57.330 "num_base_bdevs_operational": 2, 00:22:57.330 "base_bdevs_list": [ 00:22:57.330 { 00:22:57.330 "name": null, 00:22:57.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.330 "is_configured": false, 00:22:57.330 "data_offset": 2048, 00:22:57.330 "data_size": 63488 00:22:57.330 }, 00:22:57.330 { 00:22:57.330 "name": null, 00:22:57.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.330 "is_configured": false, 00:22:57.330 "data_offset": 2048, 00:22:57.330 "data_size": 63488 00:22:57.330 }, 00:22:57.330 { 00:22:57.330 "name": "BaseBdev3", 00:22:57.330 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:57.330 "is_configured": true, 00:22:57.330 "data_offset": 2048, 00:22:57.330 "data_size": 63488 00:22:57.330 }, 00:22:57.330 { 00:22:57.330 "name": "BaseBdev4", 00:22:57.330 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:57.330 "is_configured": true, 00:22:57.330 "data_offset": 2048, 00:22:57.330 "data_size": 63488 00:22:57.330 } 00:22:57.330 ] 00:22:57.330 }' 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:57.330 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:57.895 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:57.895 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:57.895 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:57.895 11:59:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:57.895 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:57.895 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.895 11:59:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.153 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:58.153 "name": "raid_bdev1", 00:22:58.153 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:58.153 "strip_size_kb": 0, 00:22:58.153 "state": "online", 00:22:58.153 "raid_level": "raid1", 00:22:58.153 "superblock": true, 00:22:58.153 "num_base_bdevs": 4, 00:22:58.153 "num_base_bdevs_discovered": 2, 00:22:58.153 "num_base_bdevs_operational": 2, 00:22:58.153 "base_bdevs_list": [ 00:22:58.153 { 00:22:58.153 "name": null, 00:22:58.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.153 "is_configured": false, 00:22:58.153 "data_offset": 2048, 00:22:58.153 "data_size": 63488 00:22:58.153 }, 00:22:58.153 { 00:22:58.153 "name": null, 00:22:58.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.153 "is_configured": false, 00:22:58.153 "data_offset": 2048, 00:22:58.153 "data_size": 63488 00:22:58.153 }, 00:22:58.153 { 00:22:58.153 "name": "BaseBdev3", 00:22:58.153 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:58.153 "is_configured": true, 00:22:58.153 "data_offset": 2048, 00:22:58.153 "data_size": 63488 00:22:58.153 }, 00:22:58.153 { 00:22:58.153 "name": "BaseBdev4", 00:22:58.153 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:58.153 "is_configured": true, 00:22:58.153 "data_offset": 2048, 00:22:58.153 "data_size": 63488 00:22:58.153 } 00:22:58.153 ] 00:22:58.153 }' 00:22:58.153 11:59:25 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:58.154 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:58.412 [2024-05-14 11:59:25.363217] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:58.412 [2024-05-14 11:59:25.363341] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:58.412 [2024-05-14 11:59:25.363356] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:58.412 request: 00:22:58.412 { 00:22:58.412 "raid_bdev": "raid_bdev1", 00:22:58.412 "base_bdev": "BaseBdev1", 00:22:58.412 "method": "bdev_raid_add_base_bdev", 00:22:58.412 "req_id": 1 00:22:58.412 } 00:22:58.412 Got JSON-RPC error response 00:22:58.412 response: 00:22:58.412 { 00:22:58.412 "code": -22, 00:22:58.412 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:58.412 } 00:22:58.412 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:22:58.412 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:58.412 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:58.412 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:58.412 11:59:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # sleep 1 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=raid_bdev1 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.343 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.601 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:59.601 "name": "raid_bdev1", 00:22:59.601 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:22:59.601 "strip_size_kb": 0, 00:22:59.601 "state": "online", 00:22:59.601 "raid_level": "raid1", 00:22:59.601 "superblock": true, 00:22:59.601 "num_base_bdevs": 4, 00:22:59.601 "num_base_bdevs_discovered": 2, 00:22:59.601 "num_base_bdevs_operational": 2, 00:22:59.601 "base_bdevs_list": [ 00:22:59.601 { 00:22:59.601 "name": null, 00:22:59.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.601 "is_configured": false, 00:22:59.601 "data_offset": 2048, 00:22:59.601 "data_size": 63488 00:22:59.601 }, 00:22:59.601 { 00:22:59.601 "name": null, 
00:22:59.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.601 "is_configured": false, 00:22:59.601 "data_offset": 2048, 00:22:59.601 "data_size": 63488 00:22:59.601 }, 00:22:59.601 { 00:22:59.601 "name": "BaseBdev3", 00:22:59.601 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:22:59.601 "is_configured": true, 00:22:59.601 "data_offset": 2048, 00:22:59.601 "data_size": 63488 00:22:59.601 }, 00:22:59.601 { 00:22:59.601 "name": "BaseBdev4", 00:22:59.601 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:22:59.601 "is_configured": true, 00:22:59.601 "data_offset": 2048, 00:22:59.601 "data_size": 63488 00:22:59.601 } 00:22:59.601 ] 00:22:59.601 }' 00:22:59.601 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:59.601 11:59:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:00.165 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:00.165 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:00.165 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:00.165 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:00.165 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:00.165 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.165 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.424 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:00.424 "name": "raid_bdev1", 00:23:00.424 "uuid": "216d446e-602d-4b0c-960a-85baca1c3aec", 00:23:00.424 "strip_size_kb": 
0, 00:23:00.424 "state": "online", 00:23:00.424 "raid_level": "raid1", 00:23:00.424 "superblock": true, 00:23:00.424 "num_base_bdevs": 4, 00:23:00.424 "num_base_bdevs_discovered": 2, 00:23:00.424 "num_base_bdevs_operational": 2, 00:23:00.424 "base_bdevs_list": [ 00:23:00.424 { 00:23:00.424 "name": null, 00:23:00.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:00.424 "is_configured": false, 00:23:00.424 "data_offset": 2048, 00:23:00.424 "data_size": 63488 00:23:00.424 }, 00:23:00.424 { 00:23:00.424 "name": null, 00:23:00.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:00.424 "is_configured": false, 00:23:00.424 "data_offset": 2048, 00:23:00.424 "data_size": 63488 00:23:00.424 }, 00:23:00.424 { 00:23:00.424 "name": "BaseBdev3", 00:23:00.424 "uuid": "97783928-a1d8-52d6-8b56-90960f216ad2", 00:23:00.424 "is_configured": true, 00:23:00.424 "data_offset": 2048, 00:23:00.424 "data_size": 63488 00:23:00.424 }, 00:23:00.424 { 00:23:00.424 "name": "BaseBdev4", 00:23:00.424 "uuid": "6772a974-9677-5bc3-a33e-c39cb728579b", 00:23:00.424 "is_configured": true, 00:23:00.424 "data_offset": 2048, 00:23:00.424 "data_size": 63488 00:23:00.424 } 00:23:00.424 ] 00:23:00.424 }' 00:23:00.424 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:00.424 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:00.424 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:00.683 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:00.683 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # killprocess 1777444 00:23:00.683 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@946 -- # '[' -z 1777444 ']' 00:23:00.683 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # kill -0 1777444 00:23:00.683 11:59:27 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # uname 00:23:00.683 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:00.683 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1777444 00:23:00.683 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:00.683 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:00.683 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1777444' 00:23:00.683 killing process with pid 1777444 00:23:00.684 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@965 -- # kill 1777444 00:23:00.684 Received shutdown signal, test time was about 28.221369 seconds 00:23:00.684 00:23:00.684 Latency(us) 00:23:00.684 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:00.684 =================================================================================================================== 00:23:00.684 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:00.684 [2024-05-14 11:59:27.564713] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:00.684 [2024-05-14 11:59:27.564814] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:00.684 [2024-05-14 11:59:27.564870] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:00.684 [2024-05-14 11:59:27.564881] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2478190 name raid_bdev1, state offline 00:23:00.684 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@970 -- # wait 1777444 00:23:00.684 [2024-05-14 11:59:27.606954] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:00.943 11:59:27 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@797 -- # return 0 00:23:00.943 00:23:00.943 real 0m33.784s 00:23:00.943 user 0m53.618s 00:23:00.943 sys 0m5.297s 00:23:00.943 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:00.943 11:59:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:00.943 ************************************ 00:23:00.943 END TEST raid_rebuild_test_sb_io 00:23:00.943 ************************************ 00:23:00.943 11:59:27 bdev_raid -- bdev/bdev_raid.sh@830 -- # '[' n == y ']' 00:23:00.943 11:59:27 bdev_raid -- bdev/bdev_raid.sh@842 -- # base_blocklen=4096 00:23:00.943 11:59:27 bdev_raid -- bdev/bdev_raid.sh@844 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:23:00.943 11:59:27 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:23:00.943 11:59:27 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:00.943 11:59:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:00.943 ************************************ 00:23:00.943 START TEST raid_state_function_test_sb_4k 00:23:00.943 ************************************ 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # raid_pid=1782299 
00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1782299' 00:23:00.943 Process raid pid: 1782299 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@247 -- # waitforlisten 1782299 /var/tmp/spdk-raid.sock 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@827 -- # '[' -z 1782299 ']' 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:00.943 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:00.943 11:59:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:00.943 [2024-05-14 11:59:27.975940] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:23:00.943 [2024-05-14 11:59:27.976001] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:01.202 [2024-05-14 11:59:28.105574] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:01.202 [2024-05-14 11:59:28.210795] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:01.202 [2024-05-14 11:59:28.283430] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:01.202 [2024-05-14 11:59:28.283467] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:02.136 11:59:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:02.136 11:59:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@860 -- # return 0 00:23:02.136 11:59:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:02.136 [2024-05-14 11:59:29.118644] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:02.136 [2024-05-14 11:59:29.118691] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:02.136 [2024-05-14 11:59:29.118703] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:02.136 [2024-05-14 11:59:29.118715] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:02.137 11:59:29 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.137 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:02.393 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:02.393 "name": "Existed_Raid", 00:23:02.393 "uuid": "5210e0bd-efd0-457c-bbe0-40dd485e053f", 00:23:02.393 "strip_size_kb": 0, 00:23:02.393 "state": "configuring", 00:23:02.393 "raid_level": "raid1", 00:23:02.393 "superblock": true, 00:23:02.393 "num_base_bdevs": 2, 00:23:02.393 "num_base_bdevs_discovered": 0, 00:23:02.393 "num_base_bdevs_operational": 2, 00:23:02.393 "base_bdevs_list": [ 00:23:02.393 { 00:23:02.393 "name": "BaseBdev1", 00:23:02.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.393 "is_configured": false, 00:23:02.393 "data_offset": 0, 00:23:02.393 "data_size": 0 00:23:02.393 }, 
00:23:02.393 { 00:23:02.393 "name": "BaseBdev2", 00:23:02.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.393 "is_configured": false, 00:23:02.393 "data_offset": 0, 00:23:02.393 "data_size": 0 00:23:02.393 } 00:23:02.393 ] 00:23:02.393 }' 00:23:02.393 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:02.393 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:02.958 11:59:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:03.216 [2024-05-14 11:59:30.189326] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:03.216 [2024-05-14 11:59:30.189362] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x200c700 name Existed_Raid, state configuring 00:23:03.216 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:03.474 [2024-05-14 11:59:30.429982] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:03.474 [2024-05-14 11:59:30.430013] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:03.474 [2024-05-14 11:59:30.430023] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:03.474 [2024-05-14 11:59:30.430035] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:03.474 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:23:03.734 [2024-05-14 11:59:30.673656] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:03.734 BaseBdev1 00:23:03.734 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:23:03.734 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:23:03.734 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:23:03.734 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local i 00:23:03.734 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:23:03.734 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:23:03.734 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:03.997 11:59:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:03.997 [ 00:23:03.997 { 00:23:03.997 "name": "BaseBdev1", 00:23:03.997 "aliases": [ 00:23:03.997 "9702e6c8-6c2e-4adc-80e9-18cf962b786e" 00:23:03.997 ], 00:23:03.997 "product_name": "Malloc disk", 00:23:03.997 "block_size": 4096, 00:23:03.997 "num_blocks": 8192, 00:23:03.997 "uuid": "9702e6c8-6c2e-4adc-80e9-18cf962b786e", 00:23:03.997 "assigned_rate_limits": { 00:23:03.997 "rw_ios_per_sec": 0, 00:23:03.997 "rw_mbytes_per_sec": 0, 00:23:03.997 "r_mbytes_per_sec": 0, 00:23:03.997 "w_mbytes_per_sec": 0 00:23:03.997 }, 00:23:03.997 "claimed": true, 00:23:03.997 "claim_type": "exclusive_write", 00:23:03.997 "zoned": false, 00:23:03.997 "supported_io_types": { 00:23:03.997 "read": true, 00:23:03.997 "write": true, 00:23:03.997 "unmap": 
true, 00:23:03.997 "write_zeroes": true, 00:23:03.997 "flush": true, 00:23:03.997 "reset": true, 00:23:03.997 "compare": false, 00:23:03.997 "compare_and_write": false, 00:23:03.997 "abort": true, 00:23:03.997 "nvme_admin": false, 00:23:03.997 "nvme_io": false 00:23:03.997 }, 00:23:03.997 "memory_domains": [ 00:23:03.997 { 00:23:03.997 "dma_device_id": "system", 00:23:03.997 "dma_device_type": 1 00:23:03.997 }, 00:23:03.997 { 00:23:03.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.997 "dma_device_type": 2 00:23:03.997 } 00:23:03.997 ], 00:23:03.997 "driver_specific": {} 00:23:03.997 } 00:23:03.997 ] 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # return 0 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:04.297 11:59:31 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:04.297 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:04.297 "name": "Existed_Raid", 00:23:04.297 "uuid": "517b74e2-2619-401a-9d56-f9906817f9b8", 00:23:04.297 "strip_size_kb": 0, 00:23:04.297 "state": "configuring", 00:23:04.297 "raid_level": "raid1", 00:23:04.297 "superblock": true, 00:23:04.297 "num_base_bdevs": 2, 00:23:04.297 "num_base_bdevs_discovered": 1, 00:23:04.298 "num_base_bdevs_operational": 2, 00:23:04.298 "base_bdevs_list": [ 00:23:04.298 { 00:23:04.298 "name": "BaseBdev1", 00:23:04.298 "uuid": "9702e6c8-6c2e-4adc-80e9-18cf962b786e", 00:23:04.298 "is_configured": true, 00:23:04.298 "data_offset": 256, 00:23:04.298 "data_size": 7936 00:23:04.298 }, 00:23:04.298 { 00:23:04.298 "name": "BaseBdev2", 00:23:04.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.298 "is_configured": false, 00:23:04.298 "data_offset": 0, 00:23:04.298 "data_size": 0 00:23:04.298 } 00:23:04.298 ] 00:23:04.298 }' 00:23:04.298 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:04.298 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:04.874 11:59:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:05.133 [2024-05-14 11:59:32.009236] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:05.134 [2024-05-14 11:59:32.009270] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x200c9a0 name Existed_Raid, state 
configuring 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:05.134 [2024-05-14 11:59:32.181732] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:05.134 [2024-05-14 11:59:32.183228] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:05.134 [2024-05-14 11:59:32.183260] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 
00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:05.134 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.397 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:05.397 "name": "Existed_Raid", 00:23:05.397 "uuid": "1bb2fe66-11e5-42ed-8dc3-f82b0786aa18", 00:23:05.397 "strip_size_kb": 0, 00:23:05.397 "state": "configuring", 00:23:05.397 "raid_level": "raid1", 00:23:05.397 "superblock": true, 00:23:05.397 "num_base_bdevs": 2, 00:23:05.397 "num_base_bdevs_discovered": 1, 00:23:05.398 "num_base_bdevs_operational": 2, 00:23:05.398 "base_bdevs_list": [ 00:23:05.398 { 00:23:05.398 "name": "BaseBdev1", 00:23:05.398 "uuid": "9702e6c8-6c2e-4adc-80e9-18cf962b786e", 00:23:05.398 "is_configured": true, 00:23:05.398 "data_offset": 256, 00:23:05.398 "data_size": 7936 00:23:05.398 }, 00:23:05.398 { 00:23:05.398 "name": "BaseBdev2", 00:23:05.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.398 "is_configured": false, 00:23:05.398 "data_offset": 0, 00:23:05.398 "data_size": 0 00:23:05.398 } 00:23:05.398 ] 00:23:05.398 }' 00:23:05.398 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:05.398 11:59:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:05.965 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:23:06.223 [2024-05-14 11:59:33.271902] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
00:23:06.223 [2024-05-14 11:59:33.272048] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x200bff0 00:23:06.223 [2024-05-14 11:59:33.272061] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:06.223 [2024-05-14 11:59:33.272234] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x200e810 00:23:06.223 [2024-05-14 11:59:33.272357] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x200bff0 00:23:06.223 [2024-05-14 11:59:33.272368] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x200bff0 00:23:06.223 [2024-05-14 11:59:33.272476] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:06.223 BaseBdev2 00:23:06.223 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:23:06.223 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:23:06.223 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:23:06.223 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local i 00:23:06.223 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:23:06.223 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:23:06.223 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:06.481 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:06.740 [ 00:23:06.740 { 00:23:06.740 "name": "BaseBdev2", 
00:23:06.740 "aliases": [ 00:23:06.740 "22c298f3-9868-4245-96cd-448a9608072a" 00:23:06.740 ], 00:23:06.740 "product_name": "Malloc disk", 00:23:06.740 "block_size": 4096, 00:23:06.740 "num_blocks": 8192, 00:23:06.740 "uuid": "22c298f3-9868-4245-96cd-448a9608072a", 00:23:06.740 "assigned_rate_limits": { 00:23:06.740 "rw_ios_per_sec": 0, 00:23:06.740 "rw_mbytes_per_sec": 0, 00:23:06.740 "r_mbytes_per_sec": 0, 00:23:06.740 "w_mbytes_per_sec": 0 00:23:06.740 }, 00:23:06.740 "claimed": true, 00:23:06.740 "claim_type": "exclusive_write", 00:23:06.740 "zoned": false, 00:23:06.740 "supported_io_types": { 00:23:06.740 "read": true, 00:23:06.740 "write": true, 00:23:06.740 "unmap": true, 00:23:06.740 "write_zeroes": true, 00:23:06.740 "flush": true, 00:23:06.740 "reset": true, 00:23:06.740 "compare": false, 00:23:06.740 "compare_and_write": false, 00:23:06.740 "abort": true, 00:23:06.740 "nvme_admin": false, 00:23:06.740 "nvme_io": false 00:23:06.740 }, 00:23:06.740 "memory_domains": [ 00:23:06.740 { 00:23:06.740 "dma_device_id": "system", 00:23:06.740 "dma_device_type": 1 00:23:06.740 }, 00:23:06.740 { 00:23:06.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.740 "dma_device_type": 2 00:23:06.740 } 00:23:06.740 ], 00:23:06.740 "driver_specific": {} 00:23:06.740 } 00:23:06.740 ] 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # return 0 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local 
expected_state=online 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.740 11:59:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:06.998 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:06.998 "name": "Existed_Raid", 00:23:06.998 "uuid": "1bb2fe66-11e5-42ed-8dc3-f82b0786aa18", 00:23:06.998 "strip_size_kb": 0, 00:23:06.998 "state": "online", 00:23:06.998 "raid_level": "raid1", 00:23:06.998 "superblock": true, 00:23:06.998 "num_base_bdevs": 2, 00:23:06.998 "num_base_bdevs_discovered": 2, 00:23:06.998 "num_base_bdevs_operational": 2, 00:23:06.998 "base_bdevs_list": [ 00:23:06.998 { 00:23:06.998 "name": "BaseBdev1", 00:23:06.998 "uuid": "9702e6c8-6c2e-4adc-80e9-18cf962b786e", 00:23:06.998 "is_configured": true, 00:23:06.998 "data_offset": 256, 00:23:06.999 "data_size": 7936 00:23:06.999 }, 00:23:06.999 { 00:23:06.999 "name": "BaseBdev2", 00:23:06.999 "uuid": 
"22c298f3-9868-4245-96cd-448a9608072a", 00:23:06.999 "is_configured": true, 00:23:06.999 "data_offset": 256, 00:23:06.999 "data_size": 7936 00:23:06.999 } 00:23:06.999 ] 00:23:06.999 }' 00:23:06.999 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:06.999 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:07.566 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:23:07.566 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:23:07.566 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:23:07.566 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:23:07.566 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:23:07.566 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@199 -- # local name 00:23:07.566 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:07.566 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:23:07.824 [2024-05-14 11:59:34.796179] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:07.824 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:23:07.825 "name": "Existed_Raid", 00:23:07.825 "aliases": [ 00:23:07.825 "1bb2fe66-11e5-42ed-8dc3-f82b0786aa18" 00:23:07.825 ], 00:23:07.825 "product_name": "Raid Volume", 00:23:07.825 "block_size": 4096, 00:23:07.825 "num_blocks": 7936, 00:23:07.825 "uuid": "1bb2fe66-11e5-42ed-8dc3-f82b0786aa18", 00:23:07.825 "assigned_rate_limits": { 
00:23:07.825 "rw_ios_per_sec": 0, 00:23:07.825 "rw_mbytes_per_sec": 0, 00:23:07.825 "r_mbytes_per_sec": 0, 00:23:07.825 "w_mbytes_per_sec": 0 00:23:07.825 }, 00:23:07.825 "claimed": false, 00:23:07.825 "zoned": false, 00:23:07.825 "supported_io_types": { 00:23:07.825 "read": true, 00:23:07.825 "write": true, 00:23:07.825 "unmap": false, 00:23:07.825 "write_zeroes": true, 00:23:07.825 "flush": false, 00:23:07.825 "reset": true, 00:23:07.825 "compare": false, 00:23:07.825 "compare_and_write": false, 00:23:07.825 "abort": false, 00:23:07.825 "nvme_admin": false, 00:23:07.825 "nvme_io": false 00:23:07.825 }, 00:23:07.825 "memory_domains": [ 00:23:07.825 { 00:23:07.825 "dma_device_id": "system", 00:23:07.825 "dma_device_type": 1 00:23:07.825 }, 00:23:07.825 { 00:23:07.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.825 "dma_device_type": 2 00:23:07.825 }, 00:23:07.825 { 00:23:07.825 "dma_device_id": "system", 00:23:07.825 "dma_device_type": 1 00:23:07.825 }, 00:23:07.825 { 00:23:07.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.825 "dma_device_type": 2 00:23:07.825 } 00:23:07.825 ], 00:23:07.825 "driver_specific": { 00:23:07.825 "raid": { 00:23:07.825 "uuid": "1bb2fe66-11e5-42ed-8dc3-f82b0786aa18", 00:23:07.825 "strip_size_kb": 0, 00:23:07.825 "state": "online", 00:23:07.825 "raid_level": "raid1", 00:23:07.825 "superblock": true, 00:23:07.825 "num_base_bdevs": 2, 00:23:07.825 "num_base_bdevs_discovered": 2, 00:23:07.825 "num_base_bdevs_operational": 2, 00:23:07.825 "base_bdevs_list": [ 00:23:07.825 { 00:23:07.825 "name": "BaseBdev1", 00:23:07.825 "uuid": "9702e6c8-6c2e-4adc-80e9-18cf962b786e", 00:23:07.825 "is_configured": true, 00:23:07.825 "data_offset": 256, 00:23:07.825 "data_size": 7936 00:23:07.825 }, 00:23:07.825 { 00:23:07.825 "name": "BaseBdev2", 00:23:07.825 "uuid": "22c298f3-9868-4245-96cd-448a9608072a", 00:23:07.825 "is_configured": true, 00:23:07.825 "data_offset": 256, 00:23:07.825 "data_size": 7936 00:23:07.825 } 00:23:07.825 ] 00:23:07.825 
} 00:23:07.825 } 00:23:07.825 }' 00:23:07.825 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:07.825 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:23:07.825 BaseBdev2' 00:23:07.825 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:07.825 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:07.825 11:59:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:08.085 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:08.085 "name": "BaseBdev1", 00:23:08.085 "aliases": [ 00:23:08.085 "9702e6c8-6c2e-4adc-80e9-18cf962b786e" 00:23:08.085 ], 00:23:08.085 "product_name": "Malloc disk", 00:23:08.085 "block_size": 4096, 00:23:08.085 "num_blocks": 8192, 00:23:08.085 "uuid": "9702e6c8-6c2e-4adc-80e9-18cf962b786e", 00:23:08.085 "assigned_rate_limits": { 00:23:08.085 "rw_ios_per_sec": 0, 00:23:08.085 "rw_mbytes_per_sec": 0, 00:23:08.085 "r_mbytes_per_sec": 0, 00:23:08.085 "w_mbytes_per_sec": 0 00:23:08.085 }, 00:23:08.085 "claimed": true, 00:23:08.085 "claim_type": "exclusive_write", 00:23:08.085 "zoned": false, 00:23:08.085 "supported_io_types": { 00:23:08.085 "read": true, 00:23:08.085 "write": true, 00:23:08.085 "unmap": true, 00:23:08.085 "write_zeroes": true, 00:23:08.085 "flush": true, 00:23:08.085 "reset": true, 00:23:08.085 "compare": false, 00:23:08.085 "compare_and_write": false, 00:23:08.085 "abort": true, 00:23:08.085 "nvme_admin": false, 00:23:08.085 "nvme_io": false 00:23:08.085 }, 00:23:08.085 "memory_domains": [ 00:23:08.085 { 00:23:08.085 "dma_device_id": "system", 00:23:08.085 
"dma_device_type": 1 00:23:08.085 }, 00:23:08.085 { 00:23:08.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:08.085 "dma_device_type": 2 00:23:08.085 } 00:23:08.085 ], 00:23:08.085 "driver_specific": {} 00:23:08.085 }' 00:23:08.085 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:08.085 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:08.343 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:08.343 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:08.343 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:08.343 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:08.343 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:08.343 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:08.343 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:08.343 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:08.600 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:08.600 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:08.600 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:08.600 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:08.600 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:08.857 11:59:35 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:08.857 "name": "BaseBdev2", 00:23:08.857 "aliases": [ 00:23:08.857 "22c298f3-9868-4245-96cd-448a9608072a" 00:23:08.857 ], 00:23:08.857 "product_name": "Malloc disk", 00:23:08.857 "block_size": 4096, 00:23:08.857 "num_blocks": 8192, 00:23:08.857 "uuid": "22c298f3-9868-4245-96cd-448a9608072a", 00:23:08.857 "assigned_rate_limits": { 00:23:08.857 "rw_ios_per_sec": 0, 00:23:08.857 "rw_mbytes_per_sec": 0, 00:23:08.857 "r_mbytes_per_sec": 0, 00:23:08.857 "w_mbytes_per_sec": 0 00:23:08.857 }, 00:23:08.857 "claimed": true, 00:23:08.857 "claim_type": "exclusive_write", 00:23:08.857 "zoned": false, 00:23:08.857 "supported_io_types": { 00:23:08.857 "read": true, 00:23:08.857 "write": true, 00:23:08.857 "unmap": true, 00:23:08.857 "write_zeroes": true, 00:23:08.857 "flush": true, 00:23:08.857 "reset": true, 00:23:08.857 "compare": false, 00:23:08.857 "compare_and_write": false, 00:23:08.857 "abort": true, 00:23:08.857 "nvme_admin": false, 00:23:08.857 "nvme_io": false 00:23:08.857 }, 00:23:08.857 "memory_domains": [ 00:23:08.857 { 00:23:08.857 "dma_device_id": "system", 00:23:08.857 "dma_device_type": 1 00:23:08.857 }, 00:23:08.857 { 00:23:08.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:08.857 "dma_device_type": 2 00:23:08.857 } 00:23:08.857 ], 00:23:08.857 "driver_specific": {} 00:23:08.857 }' 00:23:08.857 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:08.857 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:08.857 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:08.857 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:08.857 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:08.857 11:59:35 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:08.857 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:09.115 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:09.115 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:09.115 11:59:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:09.115 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:09.115 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:09.115 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:09.372 [2024-05-14 11:59:36.295974] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:09.372 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # local expected_state 00:23:09.372 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:23:09.372 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # case $1 in 00:23:09.372 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@215 -- # return 0 00:23:09.372 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:23:09.372 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:23:09.372 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local 
expected_state=online 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.373 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:09.630 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:09.630 "name": "Existed_Raid", 00:23:09.630 "uuid": "1bb2fe66-11e5-42ed-8dc3-f82b0786aa18", 00:23:09.630 "strip_size_kb": 0, 00:23:09.630 "state": "online", 00:23:09.630 "raid_level": "raid1", 00:23:09.630 "superblock": true, 00:23:09.630 "num_base_bdevs": 2, 00:23:09.630 "num_base_bdevs_discovered": 1, 00:23:09.630 "num_base_bdevs_operational": 1, 00:23:09.630 "base_bdevs_list": [ 00:23:09.630 { 00:23:09.630 "name": null, 00:23:09.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.630 "is_configured": false, 00:23:09.630 "data_offset": 256, 00:23:09.630 "data_size": 7936 00:23:09.630 }, 00:23:09.630 { 00:23:09.630 "name": "BaseBdev2", 00:23:09.630 "uuid": 
"22c298f3-9868-4245-96cd-448a9608072a", 00:23:09.630 "is_configured": true, 00:23:09.630 "data_offset": 256, 00:23:09.630 "data_size": 7936 00:23:09.630 } 00:23:09.630 ] 00:23:09.630 }' 00:23:09.630 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:09.630 11:59:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:10.195 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:23:10.196 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:23:10.196 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.196 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:23:10.454 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:23:10.454 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:10.454 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:10.712 [2024-05-14 11:59:37.644592] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:10.712 [2024-05-14 11:59:37.644663] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:10.712 [2024-05-14 11:59:37.655521] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:10.712 [2024-05-14 11:59:37.655585] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:10.712 [2024-05-14 11:59:37.655598] bdev_raid.c: 350:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x200bff0 name Existed_Raid, state offline 00:23:10.712 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:23:10.712 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:23:10.712 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.712 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@342 -- # killprocess 1782299 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@946 -- # '[' -z 1782299 ']' 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@950 -- # kill -0 1782299 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@951 -- # uname 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1782299 00:23:10.970 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:10.971 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:10.971 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@964 -- # echo 'killing process with pid 1782299' 00:23:10.971 killing process with pid 1782299 00:23:10.971 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@965 -- # kill 1782299 00:23:10.971 [2024-05-14 11:59:37.975847] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:10.971 11:59:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@970 -- # wait 1782299 00:23:10.971 [2024-05-14 11:59:37.976830] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:11.229 11:59:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@344 -- # return 0 00:23:11.229 00:23:11.229 real 0m10.294s 00:23:11.229 user 0m18.282s 00:23:11.229 sys 0m1.910s 00:23:11.229 11:59:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:11.229 11:59:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:11.229 ************************************ 00:23:11.229 END TEST raid_state_function_test_sb_4k 00:23:11.229 ************************************ 00:23:11.229 11:59:38 bdev_raid -- bdev/bdev_raid.sh@845 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:23:11.229 11:59:38 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:23:11.229 11:59:38 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:11.229 11:59:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:11.229 ************************************ 00:23:11.229 START TEST raid_superblock_test_4k 00:23:11.229 ************************************ 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 
00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # raid_pid=1783916 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@413 -- # waitforlisten 1783916 /var/tmp/spdk-raid.sock 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@827 -- # '[' -z 1783916 ']' 00:23:11.229 11:59:38 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:11.229 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:11.229 11:59:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:11.488 [2024-05-14 11:59:38.357229] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:23:11.488 [2024-05-14 11:59:38.357296] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1783916 ] 00:23:11.488 [2024-05-14 11:59:38.484812] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:11.746 [2024-05-14 11:59:38.592264] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:11.746 [2024-05-14 11:59:38.660094] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:11.746 [2024-05-14 11:59:38.660131] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@860 -- # return 0 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 
00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:12.313 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:23:12.572 malloc1 00:23:12.572 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:12.831 [2024-05-14 11:59:39.750387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:12.831 [2024-05-14 11:59:39.750440] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:12.831 [2024-05-14 11:59:39.750462] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18692a0 00:23:12.831 [2024-05-14 11:59:39.750474] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:12.831 [2024-05-14 11:59:39.752175] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:12.831 [2024-05-14 11:59:39.752203] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:12.831 pt1 00:23:12.831 11:59:39 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@416 -- # (( i++ )) 00:23:12.831 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:23:12.831 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:23:12.831 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:23:12.831 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:12.831 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:12.831 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:23:12.831 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:12.831 11:59:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:23:13.089 malloc2 00:23:13.089 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:13.347 [2024-05-14 11:59:40.249842] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:13.347 [2024-05-14 11:59:40.249898] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:13.347 [2024-05-14 11:59:40.249923] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a1c480 00:23:13.347 [2024-05-14 11:59:40.249936] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:13.347 [2024-05-14 11:59:40.251583] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:13.347 [2024-05-14 11:59:40.251619] 
vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:13.347 pt2 00:23:13.347 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:23:13.347 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:23:13.347 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:23:13.606 [2024-05-14 11:59:40.490507] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:13.606 [2024-05-14 11:59:40.491896] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:13.606 [2024-05-14 11:59:40.492055] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a122c0 00:23:13.606 [2024-05-14 11:59:40.492068] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:13.606 [2024-05-14 11:59:40.492279] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1868f70 00:23:13.606 [2024-05-14 11:59:40.492449] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a122c0 00:23:13.606 [2024-05-14 11:59:40.492461] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a122c0 00:23:13.606 [2024-05-14 11:59:40.492571] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:13.606 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local 
raid_level=raid1 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.607 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.865 11:59:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:13.865 "name": "raid_bdev1", 00:23:13.865 "uuid": "f0c36f2d-d818-41df-9b27-2c9f21a880e2", 00:23:13.865 "strip_size_kb": 0, 00:23:13.865 "state": "online", 00:23:13.865 "raid_level": "raid1", 00:23:13.865 "superblock": true, 00:23:13.865 "num_base_bdevs": 2, 00:23:13.865 "num_base_bdevs_discovered": 2, 00:23:13.865 "num_base_bdevs_operational": 2, 00:23:13.865 "base_bdevs_list": [ 00:23:13.865 { 00:23:13.865 "name": "pt1", 00:23:13.865 "uuid": "adf9fa40-9717-52e8-8f2e-ffc8ebeaa7f8", 00:23:13.865 "is_configured": true, 00:23:13.865 "data_offset": 256, 00:23:13.865 "data_size": 7936 00:23:13.865 }, 00:23:13.865 { 00:23:13.865 "name": "pt2", 00:23:13.865 "uuid": "8023b931-21ef-5440-a14a-5e652c3da59e", 00:23:13.865 "is_configured": true, 00:23:13.865 "data_offset": 256, 00:23:13.865 "data_size": 7936 00:23:13.865 } 00:23:13.865 ] 00:23:13.865 }' 00:23:13.865 11:59:40 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:13.865 11:59:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:14.431 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:23:14.431 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:23:14.431 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:23:14.431 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:23:14.431 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:23:14.431 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@199 -- # local name 00:23:14.431 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:23:14.431 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:14.690 [2024-05-14 11:59:41.565540] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:14.691 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:23:14.691 "name": "raid_bdev1", 00:23:14.691 "aliases": [ 00:23:14.691 "f0c36f2d-d818-41df-9b27-2c9f21a880e2" 00:23:14.691 ], 00:23:14.691 "product_name": "Raid Volume", 00:23:14.691 "block_size": 4096, 00:23:14.691 "num_blocks": 7936, 00:23:14.691 "uuid": "f0c36f2d-d818-41df-9b27-2c9f21a880e2", 00:23:14.691 "assigned_rate_limits": { 00:23:14.691 "rw_ios_per_sec": 0, 00:23:14.691 "rw_mbytes_per_sec": 0, 00:23:14.691 "r_mbytes_per_sec": 0, 00:23:14.691 "w_mbytes_per_sec": 0 00:23:14.691 }, 00:23:14.691 "claimed": false, 00:23:14.691 "zoned": false, 00:23:14.691 "supported_io_types": { 00:23:14.691 "read": true, 00:23:14.691 
"write": true, 00:23:14.691 "unmap": false, 00:23:14.691 "write_zeroes": true, 00:23:14.691 "flush": false, 00:23:14.691 "reset": true, 00:23:14.691 "compare": false, 00:23:14.691 "compare_and_write": false, 00:23:14.691 "abort": false, 00:23:14.691 "nvme_admin": false, 00:23:14.691 "nvme_io": false 00:23:14.691 }, 00:23:14.691 "memory_domains": [ 00:23:14.691 { 00:23:14.691 "dma_device_id": "system", 00:23:14.691 "dma_device_type": 1 00:23:14.691 }, 00:23:14.691 { 00:23:14.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:14.691 "dma_device_type": 2 00:23:14.691 }, 00:23:14.691 { 00:23:14.691 "dma_device_id": "system", 00:23:14.691 "dma_device_type": 1 00:23:14.691 }, 00:23:14.691 { 00:23:14.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:14.691 "dma_device_type": 2 00:23:14.691 } 00:23:14.691 ], 00:23:14.691 "driver_specific": { 00:23:14.691 "raid": { 00:23:14.691 "uuid": "f0c36f2d-d818-41df-9b27-2c9f21a880e2", 00:23:14.691 "strip_size_kb": 0, 00:23:14.691 "state": "online", 00:23:14.691 "raid_level": "raid1", 00:23:14.691 "superblock": true, 00:23:14.691 "num_base_bdevs": 2, 00:23:14.691 "num_base_bdevs_discovered": 2, 00:23:14.691 "num_base_bdevs_operational": 2, 00:23:14.691 "base_bdevs_list": [ 00:23:14.691 { 00:23:14.691 "name": "pt1", 00:23:14.691 "uuid": "adf9fa40-9717-52e8-8f2e-ffc8ebeaa7f8", 00:23:14.691 "is_configured": true, 00:23:14.691 "data_offset": 256, 00:23:14.691 "data_size": 7936 00:23:14.691 }, 00:23:14.691 { 00:23:14.691 "name": "pt2", 00:23:14.691 "uuid": "8023b931-21ef-5440-a14a-5e652c3da59e", 00:23:14.691 "is_configured": true, 00:23:14.691 "data_offset": 256, 00:23:14.691 "data_size": 7936 00:23:14.691 } 00:23:14.691 ] 00:23:14.691 } 00:23:14.691 } 00:23:14.691 }' 00:23:14.691 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:14.691 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # 
base_bdev_names='pt1 00:23:14.691 pt2' 00:23:14.691 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:14.691 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:14.691 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:14.949 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:14.949 "name": "pt1", 00:23:14.949 "aliases": [ 00:23:14.949 "adf9fa40-9717-52e8-8f2e-ffc8ebeaa7f8" 00:23:14.949 ], 00:23:14.949 "product_name": "passthru", 00:23:14.949 "block_size": 4096, 00:23:14.949 "num_blocks": 8192, 00:23:14.949 "uuid": "adf9fa40-9717-52e8-8f2e-ffc8ebeaa7f8", 00:23:14.949 "assigned_rate_limits": { 00:23:14.949 "rw_ios_per_sec": 0, 00:23:14.949 "rw_mbytes_per_sec": 0, 00:23:14.949 "r_mbytes_per_sec": 0, 00:23:14.949 "w_mbytes_per_sec": 0 00:23:14.949 }, 00:23:14.949 "claimed": true, 00:23:14.949 "claim_type": "exclusive_write", 00:23:14.949 "zoned": false, 00:23:14.949 "supported_io_types": { 00:23:14.949 "read": true, 00:23:14.949 "write": true, 00:23:14.949 "unmap": true, 00:23:14.949 "write_zeroes": true, 00:23:14.949 "flush": true, 00:23:14.949 "reset": true, 00:23:14.949 "compare": false, 00:23:14.949 "compare_and_write": false, 00:23:14.949 "abort": true, 00:23:14.949 "nvme_admin": false, 00:23:14.949 "nvme_io": false 00:23:14.949 }, 00:23:14.949 "memory_domains": [ 00:23:14.949 { 00:23:14.949 "dma_device_id": "system", 00:23:14.949 "dma_device_type": 1 00:23:14.949 }, 00:23:14.949 { 00:23:14.949 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:14.949 "dma_device_type": 2 00:23:14.949 } 00:23:14.949 ], 00:23:14.949 "driver_specific": { 00:23:14.949 "passthru": { 00:23:14.949 "name": "pt1", 00:23:14.949 "base_bdev_name": "malloc1" 00:23:14.949 } 00:23:14.949 } 00:23:14.949 }' 00:23:14.949 
11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:14.949 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:14.949 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:14.949 11:59:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:14.949 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:15.208 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:15.465 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:15.465 "name": "pt2", 00:23:15.465 "aliases": [ 00:23:15.465 "8023b931-21ef-5440-a14a-5e652c3da59e" 00:23:15.465 ], 00:23:15.465 "product_name": "passthru", 00:23:15.465 "block_size": 4096, 00:23:15.465 "num_blocks": 8192, 00:23:15.465 "uuid": 
"8023b931-21ef-5440-a14a-5e652c3da59e", 00:23:15.465 "assigned_rate_limits": { 00:23:15.465 "rw_ios_per_sec": 0, 00:23:15.465 "rw_mbytes_per_sec": 0, 00:23:15.465 "r_mbytes_per_sec": 0, 00:23:15.465 "w_mbytes_per_sec": 0 00:23:15.465 }, 00:23:15.465 "claimed": true, 00:23:15.465 "claim_type": "exclusive_write", 00:23:15.465 "zoned": false, 00:23:15.465 "supported_io_types": { 00:23:15.465 "read": true, 00:23:15.465 "write": true, 00:23:15.465 "unmap": true, 00:23:15.465 "write_zeroes": true, 00:23:15.465 "flush": true, 00:23:15.465 "reset": true, 00:23:15.465 "compare": false, 00:23:15.465 "compare_and_write": false, 00:23:15.465 "abort": true, 00:23:15.465 "nvme_admin": false, 00:23:15.465 "nvme_io": false 00:23:15.465 }, 00:23:15.465 "memory_domains": [ 00:23:15.465 { 00:23:15.465 "dma_device_id": "system", 00:23:15.465 "dma_device_type": 1 00:23:15.465 }, 00:23:15.465 { 00:23:15.465 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:15.465 "dma_device_type": 2 00:23:15.465 } 00:23:15.465 ], 00:23:15.465 "driver_specific": { 00:23:15.465 "passthru": { 00:23:15.465 "name": "pt2", 00:23:15.465 "base_bdev_name": "malloc2" 00:23:15.465 } 00:23:15.465 } 00:23:15.465 }' 00:23:15.465 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:15.465 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:15.723 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:15.723 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:15.723 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:15.723 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:15.723 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:15.723 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 
00:23:15.723 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:15.723 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:15.723 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:15.981 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:15.981 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:15.981 11:59:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:23:15.981 [2024-05-14 11:59:43.045469] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:15.981 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=f0c36f2d-d818-41df-9b27-2c9f21a880e2 00:23:15.981 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # '[' -z f0c36f2d-d818-41df-9b27-2c9f21a880e2 ']' 00:23:15.981 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:16.239 [2024-05-14 11:59:43.285881] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:16.239 [2024-05-14 11:59:43.285898] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:16.239 [2024-05-14 11:59:43.285951] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:16.239 [2024-05-14 11:59:43.286002] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:16.239 [2024-05-14 11:59:43.286013] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a122c0 name raid_bdev1, state offline 00:23:16.239 
11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.239 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:23:16.497 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:23:16.497 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:23:16.497 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:23:16.497 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:16.755 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:23:16.755 11:59:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:17.012 11:59:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:17.012 11:59:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:17.270 11:59:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:23:17.270 11:59:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:17.270 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:23:17.270 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- 
# valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:17.271 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.271 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:17.271 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.271 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:17.271 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.271 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:17.271 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.271 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:17.271 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:17.529 [2024-05-14 11:59:44.505083] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:17.529 [2024-05-14 11:59:44.506516] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:17.529 [2024-05-14 11:59:44.506576] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev 
found on bdev malloc1 00:23:17.529 [2024-05-14 11:59:44.506616] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:17.529 [2024-05-14 11:59:44.506635] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:17.529 [2024-05-14 11:59:44.506645] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a12000 name raid_bdev1, state configuring 00:23:17.529 request: 00:23:17.529 { 00:23:17.529 "name": "raid_bdev1", 00:23:17.529 "raid_level": "raid1", 00:23:17.529 "base_bdevs": [ 00:23:17.529 "malloc1", 00:23:17.529 "malloc2" 00:23:17.529 ], 00:23:17.529 "superblock": false, 00:23:17.529 "method": "bdev_raid_create", 00:23:17.529 "req_id": 1 00:23:17.529 } 00:23:17.529 Got JSON-RPC error response 00:23:17.529 response: 00:23:17.529 { 00:23:17.529 "code": -17, 00:23:17.529 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:17.529 } 00:23:17.529 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:23:17.529 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:17.529 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:17.529 11:59:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:17.529 11:59:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.529 11:59:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:23:17.788 11:59:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:23:17.788 11:59:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:23:17.788 11:59:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@465 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:18.047 [2024-05-14 11:59:44.990470] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:18.047 [2024-05-14 11:59:44.990514] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:18.047 [2024-05-14 11:59:44.990537] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x186ac40 00:23:18.047 [2024-05-14 11:59:44.990550] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:18.047 [2024-05-14 11:59:44.992192] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:18.047 [2024-05-14 11:59:44.992220] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:18.047 [2024-05-14 11:59:44.992288] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:23:18.047 [2024-05-14 11:59:44.992315] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:18.047 pt1 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:18.047 11:59:45 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.047 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.305 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:18.305 "name": "raid_bdev1", 00:23:18.305 "uuid": "f0c36f2d-d818-41df-9b27-2c9f21a880e2", 00:23:18.305 "strip_size_kb": 0, 00:23:18.305 "state": "configuring", 00:23:18.305 "raid_level": "raid1", 00:23:18.305 "superblock": true, 00:23:18.305 "num_base_bdevs": 2, 00:23:18.305 "num_base_bdevs_discovered": 1, 00:23:18.305 "num_base_bdevs_operational": 2, 00:23:18.305 "base_bdevs_list": [ 00:23:18.305 { 00:23:18.305 "name": "pt1", 00:23:18.305 "uuid": "adf9fa40-9717-52e8-8f2e-ffc8ebeaa7f8", 00:23:18.305 "is_configured": true, 00:23:18.305 "data_offset": 256, 00:23:18.305 "data_size": 7936 00:23:18.305 }, 00:23:18.305 { 00:23:18.305 "name": null, 00:23:18.305 "uuid": "8023b931-21ef-5440-a14a-5e652c3da59e", 00:23:18.305 "is_configured": false, 00:23:18.305 "data_offset": 256, 00:23:18.305 "data_size": 7936 00:23:18.305 } 00:23:18.305 ] 00:23:18.305 }' 00:23:18.305 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:18.305 11:59:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:18.870 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:23:18.870 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 
00:23:18.870 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:23:18.870 11:59:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:19.128 [2024-05-14 11:59:46.073363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:19.128 [2024-05-14 11:59:46.073419] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:19.128 [2024-05-14 11:59:46.073439] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1864f30 00:23:19.128 [2024-05-14 11:59:46.073451] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:19.128 [2024-05-14 11:59:46.073800] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:19.128 [2024-05-14 11:59:46.073817] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:19.128 [2024-05-14 11:59:46.073881] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:23:19.128 [2024-05-14 11:59:46.073902] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:19.128 [2024-05-14 11:59:46.074000] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x185fbd0 00:23:19.128 [2024-05-14 11:59:46.074012] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:19.128 [2024-05-14 11:59:46.074178] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x186a1d0 00:23:19.128 [2024-05-14 11:59:46.074304] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x185fbd0 00:23:19.128 [2024-05-14 11:59:46.074315] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x185fbd0 00:23:19.128 
[2024-05-14 11:59:46.074426] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:19.128 pt2 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.129 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.387 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:19.387 "name": "raid_bdev1", 00:23:19.387 "uuid": "f0c36f2d-d818-41df-9b27-2c9f21a880e2", 00:23:19.387 
"strip_size_kb": 0, 00:23:19.387 "state": "online", 00:23:19.387 "raid_level": "raid1", 00:23:19.387 "superblock": true, 00:23:19.387 "num_base_bdevs": 2, 00:23:19.387 "num_base_bdevs_discovered": 2, 00:23:19.387 "num_base_bdevs_operational": 2, 00:23:19.387 "base_bdevs_list": [ 00:23:19.387 { 00:23:19.387 "name": "pt1", 00:23:19.387 "uuid": "adf9fa40-9717-52e8-8f2e-ffc8ebeaa7f8", 00:23:19.387 "is_configured": true, 00:23:19.387 "data_offset": 256, 00:23:19.387 "data_size": 7936 00:23:19.387 }, 00:23:19.387 { 00:23:19.387 "name": "pt2", 00:23:19.387 "uuid": "8023b931-21ef-5440-a14a-5e652c3da59e", 00:23:19.387 "is_configured": true, 00:23:19.387 "data_offset": 256, 00:23:19.387 "data_size": 7936 00:23:19.387 } 00:23:19.387 ] 00:23:19.387 }' 00:23:19.387 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:19.387 11:59:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:19.952 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:23:19.952 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:23:19.952 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:23:19.952 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:23:19.952 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:23:19.952 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@199 -- # local name 00:23:19.952 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:19.952 11:59:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:23:20.213 [2024-05-14 11:59:47.136393] bdev_raid.c:1119:raid_bdev_dump_info_json: 
*DEBUG*: raid_bdev_dump_config_json 00:23:20.213 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:23:20.213 "name": "raid_bdev1", 00:23:20.213 "aliases": [ 00:23:20.213 "f0c36f2d-d818-41df-9b27-2c9f21a880e2" 00:23:20.213 ], 00:23:20.213 "product_name": "Raid Volume", 00:23:20.213 "block_size": 4096, 00:23:20.213 "num_blocks": 7936, 00:23:20.213 "uuid": "f0c36f2d-d818-41df-9b27-2c9f21a880e2", 00:23:20.213 "assigned_rate_limits": { 00:23:20.213 "rw_ios_per_sec": 0, 00:23:20.213 "rw_mbytes_per_sec": 0, 00:23:20.213 "r_mbytes_per_sec": 0, 00:23:20.213 "w_mbytes_per_sec": 0 00:23:20.213 }, 00:23:20.213 "claimed": false, 00:23:20.213 "zoned": false, 00:23:20.213 "supported_io_types": { 00:23:20.213 "read": true, 00:23:20.213 "write": true, 00:23:20.213 "unmap": false, 00:23:20.213 "write_zeroes": true, 00:23:20.213 "flush": false, 00:23:20.213 "reset": true, 00:23:20.213 "compare": false, 00:23:20.213 "compare_and_write": false, 00:23:20.213 "abort": false, 00:23:20.213 "nvme_admin": false, 00:23:20.213 "nvme_io": false 00:23:20.213 }, 00:23:20.213 "memory_domains": [ 00:23:20.213 { 00:23:20.213 "dma_device_id": "system", 00:23:20.213 "dma_device_type": 1 00:23:20.213 }, 00:23:20.213 { 00:23:20.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:20.213 "dma_device_type": 2 00:23:20.213 }, 00:23:20.213 { 00:23:20.213 "dma_device_id": "system", 00:23:20.213 "dma_device_type": 1 00:23:20.213 }, 00:23:20.213 { 00:23:20.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:20.213 "dma_device_type": 2 00:23:20.213 } 00:23:20.213 ], 00:23:20.213 "driver_specific": { 00:23:20.213 "raid": { 00:23:20.213 "uuid": "f0c36f2d-d818-41df-9b27-2c9f21a880e2", 00:23:20.213 "strip_size_kb": 0, 00:23:20.213 "state": "online", 00:23:20.213 "raid_level": "raid1", 00:23:20.213 "superblock": true, 00:23:20.213 "num_base_bdevs": 2, 00:23:20.213 "num_base_bdevs_discovered": 2, 00:23:20.213 "num_base_bdevs_operational": 2, 00:23:20.213 "base_bdevs_list": [ 
00:23:20.213 { 00:23:20.213 "name": "pt1", 00:23:20.213 "uuid": "adf9fa40-9717-52e8-8f2e-ffc8ebeaa7f8", 00:23:20.213 "is_configured": true, 00:23:20.213 "data_offset": 256, 00:23:20.213 "data_size": 7936 00:23:20.213 }, 00:23:20.213 { 00:23:20.213 "name": "pt2", 00:23:20.213 "uuid": "8023b931-21ef-5440-a14a-5e652c3da59e", 00:23:20.213 "is_configured": true, 00:23:20.213 "data_offset": 256, 00:23:20.213 "data_size": 7936 00:23:20.213 } 00:23:20.213 ] 00:23:20.213 } 00:23:20.213 } 00:23:20.213 }' 00:23:20.213 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:20.213 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:23:20.213 pt2' 00:23:20.213 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:20.213 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:20.213 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:20.551 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:20.551 "name": "pt1", 00:23:20.551 "aliases": [ 00:23:20.551 "adf9fa40-9717-52e8-8f2e-ffc8ebeaa7f8" 00:23:20.551 ], 00:23:20.551 "product_name": "passthru", 00:23:20.551 "block_size": 4096, 00:23:20.551 "num_blocks": 8192, 00:23:20.551 "uuid": "adf9fa40-9717-52e8-8f2e-ffc8ebeaa7f8", 00:23:20.551 "assigned_rate_limits": { 00:23:20.551 "rw_ios_per_sec": 0, 00:23:20.551 "rw_mbytes_per_sec": 0, 00:23:20.551 "r_mbytes_per_sec": 0, 00:23:20.551 "w_mbytes_per_sec": 0 00:23:20.551 }, 00:23:20.551 "claimed": true, 00:23:20.551 "claim_type": "exclusive_write", 00:23:20.551 "zoned": false, 00:23:20.551 "supported_io_types": { 00:23:20.551 "read": true, 00:23:20.551 "write": true, 
00:23:20.551 "unmap": true, 00:23:20.551 "write_zeroes": true, 00:23:20.551 "flush": true, 00:23:20.551 "reset": true, 00:23:20.551 "compare": false, 00:23:20.551 "compare_and_write": false, 00:23:20.551 "abort": true, 00:23:20.551 "nvme_admin": false, 00:23:20.551 "nvme_io": false 00:23:20.551 }, 00:23:20.551 "memory_domains": [ 00:23:20.551 { 00:23:20.551 "dma_device_id": "system", 00:23:20.551 "dma_device_type": 1 00:23:20.551 }, 00:23:20.551 { 00:23:20.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:20.551 "dma_device_type": 2 00:23:20.551 } 00:23:20.551 ], 00:23:20.551 "driver_specific": { 00:23:20.551 "passthru": { 00:23:20.551 "name": "pt1", 00:23:20.551 "base_bdev_name": "malloc1" 00:23:20.551 } 00:23:20.551 } 00:23:20.551 }' 00:23:20.551 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:20.551 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:20.551 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:20.551 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:20.551 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:20.808 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:20.808 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:20.808 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:20.808 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:20.808 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:20.808 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:20.808 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:20.808 11:59:47 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:20.808 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:20.808 11:59:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:21.066 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:21.066 "name": "pt2", 00:23:21.066 "aliases": [ 00:23:21.066 "8023b931-21ef-5440-a14a-5e652c3da59e" 00:23:21.066 ], 00:23:21.066 "product_name": "passthru", 00:23:21.066 "block_size": 4096, 00:23:21.066 "num_blocks": 8192, 00:23:21.066 "uuid": "8023b931-21ef-5440-a14a-5e652c3da59e", 00:23:21.066 "assigned_rate_limits": { 00:23:21.066 "rw_ios_per_sec": 0, 00:23:21.066 "rw_mbytes_per_sec": 0, 00:23:21.066 "r_mbytes_per_sec": 0, 00:23:21.066 "w_mbytes_per_sec": 0 00:23:21.066 }, 00:23:21.066 "claimed": true, 00:23:21.066 "claim_type": "exclusive_write", 00:23:21.066 "zoned": false, 00:23:21.066 "supported_io_types": { 00:23:21.066 "read": true, 00:23:21.066 "write": true, 00:23:21.066 "unmap": true, 00:23:21.066 "write_zeroes": true, 00:23:21.066 "flush": true, 00:23:21.066 "reset": true, 00:23:21.066 "compare": false, 00:23:21.066 "compare_and_write": false, 00:23:21.066 "abort": true, 00:23:21.066 "nvme_admin": false, 00:23:21.066 "nvme_io": false 00:23:21.066 }, 00:23:21.066 "memory_domains": [ 00:23:21.066 { 00:23:21.066 "dma_device_id": "system", 00:23:21.066 "dma_device_type": 1 00:23:21.066 }, 00:23:21.066 { 00:23:21.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:21.066 "dma_device_type": 2 00:23:21.066 } 00:23:21.066 ], 00:23:21.066 "driver_specific": { 00:23:21.066 "passthru": { 00:23:21.066 "name": "pt2", 00:23:21.066 "base_bdev_name": "malloc2" 00:23:21.066 } 00:23:21.066 } 00:23:21.066 }' 00:23:21.066 11:59:48 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:21.066 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:21.066 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:23:21.066 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:21.324 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:23:21.582 [2024-05-14 11:59:48.612317] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:21.582 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@487 -- # '[' f0c36f2d-d818-41df-9b27-2c9f21a880e2 '!=' f0c36f2d-d818-41df-9b27-2c9f21a880e2 ']' 00:23:21.582 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:23:21.582 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # case $1 in 00:23:21.582 11:59:48 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@215 -- # return 0 00:23:21.582 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:21.840 [2024-05-14 11:59:48.856784] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.840 11:59:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.099 11:59:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:22.099 "name": "raid_bdev1", 00:23:22.099 "uuid": 
"f0c36f2d-d818-41df-9b27-2c9f21a880e2", 00:23:22.099 "strip_size_kb": 0, 00:23:22.099 "state": "online", 00:23:22.099 "raid_level": "raid1", 00:23:22.099 "superblock": true, 00:23:22.099 "num_base_bdevs": 2, 00:23:22.099 "num_base_bdevs_discovered": 1, 00:23:22.099 "num_base_bdevs_operational": 1, 00:23:22.099 "base_bdevs_list": [ 00:23:22.099 { 00:23:22.099 "name": null, 00:23:22.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.099 "is_configured": false, 00:23:22.099 "data_offset": 256, 00:23:22.099 "data_size": 7936 00:23:22.099 }, 00:23:22.099 { 00:23:22.099 "name": "pt2", 00:23:22.099 "uuid": "8023b931-21ef-5440-a14a-5e652c3da59e", 00:23:22.099 "is_configured": true, 00:23:22.099 "data_offset": 256, 00:23:22.099 "data_size": 7936 00:23:22.099 } 00:23:22.099 ] 00:23:22.099 }' 00:23:22.099 11:59:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:22.099 11:59:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:22.664 11:59:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:22.922 [2024-05-14 11:59:49.939634] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:22.922 [2024-05-14 11:59:49.939663] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:22.922 [2024-05-14 11:59:49.939721] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:22.922 [2024-05-14 11:59:49.939765] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:22.922 [2024-05-14 11:59:49.939777] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x185fbd0 name raid_bdev1, state offline 00:23:22.922 11:59:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.922 11:59:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:23:23.180 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:23:23.180 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:23:23.180 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:23:23.180 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:23:23.180 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:23.438 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:23:23.438 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:23:23.438 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:23:23.438 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:23:23.438 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # i=1 00:23:23.438 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:23.696 [2024-05-14 11:59:50.681567] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:23.696 [2024-05-14 11:59:50.681617] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.696 [2024-05-14 11:59:50.681636] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1865160 00:23:23.696 [2024-05-14 11:59:50.681649] 
vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.696 [2024-05-14 11:59:50.683318] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.696 [2024-05-14 11:59:50.683353] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:23.696 [2024-05-14 11:59:50.683428] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:23:23.696 [2024-05-14 11:59:50.683456] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:23.696 [2024-05-14 11:59:50.683548] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1863900 00:23:23.696 [2024-05-14 11:59:50.683559] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:23.696 [2024-05-14 11:59:50.683738] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x186a250 00:23:23.696 [2024-05-14 11:59:50.683874] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1863900 00:23:23.696 [2024-05-14 11:59:50.683885] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1863900 00:23:23.696 [2024-05-14 11:59:50.683986] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:23.696 pt2 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 
-- # local num_base_bdevs_operational=1 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.696 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.954 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:23.954 "name": "raid_bdev1", 00:23:23.954 "uuid": "f0c36f2d-d818-41df-9b27-2c9f21a880e2", 00:23:23.954 "strip_size_kb": 0, 00:23:23.954 "state": "online", 00:23:23.954 "raid_level": "raid1", 00:23:23.954 "superblock": true, 00:23:23.954 "num_base_bdevs": 2, 00:23:23.954 "num_base_bdevs_discovered": 1, 00:23:23.954 "num_base_bdevs_operational": 1, 00:23:23.954 "base_bdevs_list": [ 00:23:23.954 { 00:23:23.954 "name": null, 00:23:23.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:23.954 "is_configured": false, 00:23:23.954 "data_offset": 256, 00:23:23.954 "data_size": 7936 00:23:23.954 }, 00:23:23.954 { 00:23:23.954 "name": "pt2", 00:23:23.954 "uuid": "8023b931-21ef-5440-a14a-5e652c3da59e", 00:23:23.954 "is_configured": true, 00:23:23.954 "data_offset": 256, 00:23:23.954 "data_size": 7936 00:23:23.954 } 00:23:23.954 ] 00:23:23.954 }' 00:23:23.954 11:59:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:23.954 11:59:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:24.521 11:59:51 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # '[' 2 -gt 2 ']' 00:23:24.521 11:59:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:24.521 11:59:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:23:24.779 [2024-05-14 11:59:51.752595] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@563 -- # '[' f0c36f2d-d818-41df-9b27-2c9f21a880e2 '!=' f0c36f2d-d818-41df-9b27-2c9f21a880e2 ']' 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@568 -- # killprocess 1783916 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@946 -- # '[' -z 1783916 ']' 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@950 -- # kill -0 1783916 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@951 -- # uname 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1783916 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1783916' 00:23:24.779 killing process with pid 1783916 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@965 -- # kill 1783916 00:23:24.779 [2024-05-14 11:59:51.822624] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 
00:23:24.779 [2024-05-14 11:59:51.822684] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:24.779 [2024-05-14 11:59:51.822731] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:24.779 [2024-05-14 11:59:51.822742] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1863900 name raid_bdev1, state offline 00:23:24.779 11:59:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@970 -- # wait 1783916 00:23:24.779 [2024-05-14 11:59:51.839031] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:25.037 11:59:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # return 0 00:23:25.037 00:23:25.037 real 0m13.748s 00:23:25.037 user 0m24.763s 00:23:25.037 sys 0m2.569s 00:23:25.037 11:59:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:25.037 11:59:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:25.037 ************************************ 00:23:25.037 END TEST raid_superblock_test_4k 00:23:25.037 ************************************ 00:23:25.037 11:59:52 bdev_raid -- bdev/bdev_raid.sh@846 -- # '[' true = true ']' 00:23:25.037 11:59:52 bdev_raid -- bdev/bdev_raid.sh@847 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:23:25.037 11:59:52 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:23:25.037 11:59:52 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:25.037 11:59:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:25.037 ************************************ 00:23:25.037 START TEST raid_rebuild_test_sb_4k 00:23:25.037 ************************************ 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false true 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local verify=true 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:23:25.037 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@581 -- # local strip_size 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@582 -- # local create_arg 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:23:25.295 11:59:52 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@584 -- # local data_offset 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # raid_pid=1785952 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@603 -- # waitforlisten 1785952 /var/tmp/spdk-raid.sock 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@827 -- # '[' -z 1785952 ']' 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:25.295 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:25.295 11:59:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:25.295 [2024-05-14 11:59:52.182866] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:23:25.295 [2024-05-14 11:59:52.182927] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1785952 ] 00:23:25.295 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:25.295 Zero copy mechanism will not be used. 00:23:25.295 [2024-05-14 11:59:52.310508] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:25.553 [2024-05-14 11:59:52.416866] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:25.553 [2024-05-14 11:59:52.491805] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:25.553 [2024-05-14 11:59:52.491844] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:26.119 11:59:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:26.119 11:59:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@860 -- # return 0 00:23:26.119 11:59:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:23:26.119 11:59:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:23:26.378 BaseBdev1_malloc 00:23:26.378 11:59:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:26.636 [2024-05-14 11:59:53.568209] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:26.636 [2024-05-14 11:59:53.568255] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.636 [2024-05-14 11:59:53.568277] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: 
io_device created at: 0x0x1b10960 00:23:26.636 [2024-05-14 11:59:53.568290] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.636 [2024-05-14 11:59:53.570022] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.636 [2024-05-14 11:59:53.570055] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:26.636 BaseBdev1 00:23:26.636 11:59:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:23:26.636 11:59:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:23:26.894 BaseBdev2_malloc 00:23:26.894 11:59:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:27.153 [2024-05-14 11:59:54.058344] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:27.153 [2024-05-14 11:59:54.058389] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:27.153 [2024-05-14 11:59:54.058416] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cc3b40 00:23:27.153 [2024-05-14 11:59:54.058429] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:27.153 [2024-05-14 11:59:54.059978] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:27.153 [2024-05-14 11:59:54.060006] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:27.153 BaseBdev2 00:23:27.153 11:59:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 
00:23:27.411 spare_malloc 00:23:27.411 11:59:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:27.668 spare_delay 00:23:27.668 11:59:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:27.926 [2024-05-14 11:59:54.797207] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:27.926 [2024-05-14 11:59:54.797247] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:27.926 [2024-05-14 11:59:54.797266] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b0c700 00:23:27.926 [2024-05-14 11:59:54.797278] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:27.926 [2024-05-14 11:59:54.798709] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:27.926 [2024-05-14 11:59:54.798737] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:27.926 spare 00:23:27.926 11:59:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:28.184 [2024-05-14 11:59:55.049906] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:28.184 [2024-05-14 11:59:55.051335] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:28.184 [2024-05-14 11:59:55.051507] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b0ba50 00:23:28.184 [2024-05-14 11:59:55.051522] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, 
blocklen 4096 00:23:28.184 [2024-05-14 11:59:55.051724] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b092d0 00:23:28.184 [2024-05-14 11:59:55.051871] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b0ba50 00:23:28.185 [2024-05-14 11:59:55.051881] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b0ba50 00:23:28.185 [2024-05-14 11:59:55.051983] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.185 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.443 
11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:28.443 "name": "raid_bdev1", 00:23:28.443 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:28.443 "strip_size_kb": 0, 00:23:28.443 "state": "online", 00:23:28.443 "raid_level": "raid1", 00:23:28.443 "superblock": true, 00:23:28.443 "num_base_bdevs": 2, 00:23:28.443 "num_base_bdevs_discovered": 2, 00:23:28.443 "num_base_bdevs_operational": 2, 00:23:28.443 "base_bdevs_list": [ 00:23:28.443 { 00:23:28.443 "name": "BaseBdev1", 00:23:28.443 "uuid": "053a4c95-0d2f-5cc7-86cc-e0833677b5b6", 00:23:28.443 "is_configured": true, 00:23:28.443 "data_offset": 256, 00:23:28.443 "data_size": 7936 00:23:28.443 }, 00:23:28.443 { 00:23:28.443 "name": "BaseBdev2", 00:23:28.443 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:28.443 "is_configured": true, 00:23:28.443 "data_offset": 256, 00:23:28.443 "data_size": 7936 00:23:28.443 } 00:23:28.443 ] 00:23:28.443 }' 00:23:28.443 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:28.443 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:29.009 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:29.009 11:59:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:23:29.267 [2024-05-14 11:59:56.120923] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:29.267 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=7936 00:23:29.267 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.267 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # jq -r 
'.[].base_bdevs_list[0].data_offset' 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # data_offset=256 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:29.525 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:29.526 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:23:29.526 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:29.526 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:29.526 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:29.526 [2024-05-14 11:59:56.598003] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b0b960 00:23:29.526 /dev/nbd0 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd0 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@865 -- # local i 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # break 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:29.784 1+0 records in 00:23:29.784 1+0 records out 00:23:29.784 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287003 s, 14.3 MB/s 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # size=4096 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # return 0 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:29.784 
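The `waitfornbd` steps traced above poll `/proc/partitions` until the NBD device appears, then verify it with a direct-I/O `dd` read. A minimal sketch of that polling logic follows; the partitions file path is parameterized here (an assumption made for testability), whereas the real helper always reads `/proc/partitions`.

```shell
#!/usr/bin/env bash
# Sketch of the waitfornbd-style polling loop from the transcript:
# retry up to 20 times, checking whether the named device shows up
# in the partitions table, pausing briefly between attempts.
wait_for_nbd() {
    local nbd_name=$1 partitions=${2:-/proc/partitions} i
    for ((i = 1; i <= 20; i++)); do
        if grep -q -w "$nbd_name" "$partitions"; then
            return 0    # device is visible to the kernel
        fi
        sleep 0.1
    done
    return 1            # device never appeared
}
```

Once the device is present, the transcript's helper additionally runs `dd if=/dev/nbd0 ... bs=4096 count=1 iflag=direct` and checks the copied size, confirming the device actually serves reads before the test proceeds.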
11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:23:29.784 11:59:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:23:30.350 7936+0 records in 00:23:30.350 7936+0 records out 00:23:30.350 32505856 bytes (33 MB, 31 MiB) copied, 0.760526 s, 42.7 MB/s 00:23:30.350 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:30.350 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:30.350 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:30.350 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:30.350 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:23:30.350 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:30.350 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:30.608 [2024-05-14 11:59:57.685574] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.608 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:30.608 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:30.608 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:30.608 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 
00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:30.866 [2024-05-14 11:59:57.920388] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:30.866 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:30.867 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.867 11:59:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.125 11:59:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:31.125 "name": "raid_bdev1", 00:23:31.125 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:31.125 "strip_size_kb": 0, 00:23:31.125 "state": "online", 00:23:31.125 "raid_level": "raid1", 00:23:31.125 "superblock": true, 00:23:31.125 "num_base_bdevs": 2, 00:23:31.125 "num_base_bdevs_discovered": 1, 00:23:31.125 "num_base_bdevs_operational": 1, 00:23:31.125 "base_bdevs_list": [ 00:23:31.125 { 00:23:31.125 "name": null, 00:23:31.125 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:31.125 "is_configured": false, 00:23:31.125 "data_offset": 256, 00:23:31.125 "data_size": 7936 00:23:31.125 }, 00:23:31.125 { 00:23:31.125 "name": "BaseBdev2", 00:23:31.125 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:31.125 "is_configured": true, 00:23:31.125 "data_offset": 256, 00:23:31.125 "data_size": 7936 00:23:31.125 } 00:23:31.125 ] 00:23:31.125 }' 00:23:31.125 11:59:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:31.125 11:59:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:31.691 11:59:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:31.948 [2024-05-14 11:59:58.927067] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:31.948 [2024-05-14 11:59:58.932039] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b092d0 00:23:31.948 [2024-05-14 11:59:58.934249] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid 
bdev raid_bdev1 00:23:31.948 11:59:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # sleep 1 00:23:32.883 11:59:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:32.883 11:59:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:32.883 11:59:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:32.883 11:59:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:32.883 11:59:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:32.883 11:59:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.883 11:59:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.141 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:33.141 "name": "raid_bdev1", 00:23:33.141 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:33.141 "strip_size_kb": 0, 00:23:33.141 "state": "online", 00:23:33.141 "raid_level": "raid1", 00:23:33.141 "superblock": true, 00:23:33.141 "num_base_bdevs": 2, 00:23:33.141 "num_base_bdevs_discovered": 2, 00:23:33.141 "num_base_bdevs_operational": 2, 00:23:33.141 "process": { 00:23:33.141 "type": "rebuild", 00:23:33.141 "target": "spare", 00:23:33.141 "progress": { 00:23:33.141 "blocks": 3072, 00:23:33.141 "percent": 38 00:23:33.141 } 00:23:33.141 }, 00:23:33.141 "base_bdevs_list": [ 00:23:33.141 { 00:23:33.141 "name": "spare", 00:23:33.141 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:33.141 "is_configured": true, 00:23:33.141 "data_offset": 256, 00:23:33.141 "data_size": 7936 00:23:33.141 }, 00:23:33.141 { 00:23:33.141 "name": "BaseBdev2", 00:23:33.141 
"uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:33.141 "is_configured": true, 00:23:33.141 "data_offset": 256, 00:23:33.141 "data_size": 7936 00:23:33.141 } 00:23:33.141 ] 00:23:33.141 }' 00:23:33.141 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:33.399 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:33.399 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:33.399 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:33.399 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:33.657 [2024-05-14 12:00:00.521220] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:33.657 [2024-05-14 12:00:00.547242] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:33.657 [2024-05-14 12:00:00.547288] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:33.657 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:33.657 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:33.657 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:33.657 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:33.657 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:33.657 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:33.657 12:00:00 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:33.657 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:33.657 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:33.658 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:33.658 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.658 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.916 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:33.916 "name": "raid_bdev1", 00:23:33.916 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:33.916 "strip_size_kb": 0, 00:23:33.916 "state": "online", 00:23:33.916 "raid_level": "raid1", 00:23:33.916 "superblock": true, 00:23:33.916 "num_base_bdevs": 2, 00:23:33.916 "num_base_bdevs_discovered": 1, 00:23:33.916 "num_base_bdevs_operational": 1, 00:23:33.916 "base_bdevs_list": [ 00:23:33.916 { 00:23:33.916 "name": null, 00:23:33.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.916 "is_configured": false, 00:23:33.916 "data_offset": 256, 00:23:33.916 "data_size": 7936 00:23:33.916 }, 00:23:33.916 { 00:23:33.916 "name": "BaseBdev2", 00:23:33.916 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:33.916 "is_configured": true, 00:23:33.916 "data_offset": 256, 00:23:33.916 "data_size": 7936 00:23:33.916 } 00:23:33.916 ] 00:23:33.916 }' 00:23:33.916 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:33.916 12:00:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:34.478 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@664 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:23:34.478 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:34.478 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:34.478 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:34.478 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:34.478 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.478 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.734 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:34.734 "name": "raid_bdev1", 00:23:34.734 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:34.734 "strip_size_kb": 0, 00:23:34.734 "state": "online", 00:23:34.734 "raid_level": "raid1", 00:23:34.734 "superblock": true, 00:23:34.734 "num_base_bdevs": 2, 00:23:34.734 "num_base_bdevs_discovered": 1, 00:23:34.734 "num_base_bdevs_operational": 1, 00:23:34.734 "base_bdevs_list": [ 00:23:34.734 { 00:23:34.734 "name": null, 00:23:34.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.734 "is_configured": false, 00:23:34.734 "data_offset": 256, 00:23:34.734 "data_size": 7936 00:23:34.734 }, 00:23:34.734 { 00:23:34.734 "name": "BaseBdev2", 00:23:34.734 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:34.734 "is_configured": true, 00:23:34.734 "data_offset": 256, 00:23:34.734 "data_size": 7936 00:23:34.734 } 00:23:34.734 ] 00:23:34.734 }' 00:23:34.734 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:34.734 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == 
\n\o\n\e ]] 00:23:34.734 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:34.734 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:34.734 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:34.989 [2024-05-14 12:00:01.979659] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:34.989 [2024-05-14 12:00:01.984580] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b08260 00:23:34.989 [2024-05-14 12:00:01.986047] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:34.989 12:00:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@668 -- # sleep 1 00:23:35.920 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:35.920 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:35.920 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:35.920 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:35.920 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:36.178 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.178 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.178 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:36.178 "name": "raid_bdev1", 00:23:36.178 
"uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:36.178 "strip_size_kb": 0, 00:23:36.178 "state": "online", 00:23:36.178 "raid_level": "raid1", 00:23:36.178 "superblock": true, 00:23:36.178 "num_base_bdevs": 2, 00:23:36.178 "num_base_bdevs_discovered": 2, 00:23:36.178 "num_base_bdevs_operational": 2, 00:23:36.178 "process": { 00:23:36.178 "type": "rebuild", 00:23:36.178 "target": "spare", 00:23:36.178 "progress": { 00:23:36.178 "blocks": 3072, 00:23:36.178 "percent": 38 00:23:36.178 } 00:23:36.178 }, 00:23:36.178 "base_bdevs_list": [ 00:23:36.178 { 00:23:36.178 "name": "spare", 00:23:36.178 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:36.178 "is_configured": true, 00:23:36.178 "data_offset": 256, 00:23:36.178 "data_size": 7936 00:23:36.178 }, 00:23:36.178 { 00:23:36.178 "name": "BaseBdev2", 00:23:36.178 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:36.178 "is_configured": true, 00:23:36.178 "data_offset": 256, 00:23:36.178 "data_size": 7936 00:23:36.178 } 00:23:36.178 ] 00:23:36.178 }' 00:23:36.178 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:23:36.436 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:23:36.436 12:00:03 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@711 -- # local timeout=859 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.436 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.703 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:36.703 "name": "raid_bdev1", 00:23:36.703 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:36.703 "strip_size_kb": 0, 00:23:36.703 "state": "online", 00:23:36.703 "raid_level": "raid1", 00:23:36.703 "superblock": true, 00:23:36.703 "num_base_bdevs": 2, 00:23:36.703 "num_base_bdevs_discovered": 2, 00:23:36.703 "num_base_bdevs_operational": 2, 00:23:36.703 "process": { 00:23:36.703 "type": "rebuild", 00:23:36.703 "target": "spare", 00:23:36.703 "progress": { 00:23:36.703 "blocks": 3840, 00:23:36.703 "percent": 48 00:23:36.703 } 00:23:36.703 }, 00:23:36.703 
"base_bdevs_list": [ 00:23:36.703 { 00:23:36.703 "name": "spare", 00:23:36.703 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:36.703 "is_configured": true, 00:23:36.703 "data_offset": 256, 00:23:36.703 "data_size": 7936 00:23:36.703 }, 00:23:36.703 { 00:23:36.703 "name": "BaseBdev2", 00:23:36.703 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:36.703 "is_configured": true, 00:23:36.703 "data_offset": 256, 00:23:36.703 "data_size": 7936 00:23:36.703 } 00:23:36.703 ] 00:23:36.703 }' 00:23:36.703 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:36.703 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:36.703 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:36.703 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:36.703 12:00:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@716 -- # sleep 1 00:23:37.689 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:23:37.689 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:37.689 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:37.689 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:37.690 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:37.690 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:37.690 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.690 12:00:04 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.948 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:37.948 "name": "raid_bdev1", 00:23:37.948 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:37.948 "strip_size_kb": 0, 00:23:37.948 "state": "online", 00:23:37.948 "raid_level": "raid1", 00:23:37.948 "superblock": true, 00:23:37.948 "num_base_bdevs": 2, 00:23:37.948 "num_base_bdevs_discovered": 2, 00:23:37.948 "num_base_bdevs_operational": 2, 00:23:37.948 "process": { 00:23:37.948 "type": "rebuild", 00:23:37.948 "target": "spare", 00:23:37.948 "progress": { 00:23:37.948 "blocks": 7168, 00:23:37.948 "percent": 90 00:23:37.948 } 00:23:37.948 }, 00:23:37.948 "base_bdevs_list": [ 00:23:37.948 { 00:23:37.948 "name": "spare", 00:23:37.948 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:37.948 "is_configured": true, 00:23:37.948 "data_offset": 256, 00:23:37.948 "data_size": 7936 00:23:37.948 }, 00:23:37.948 { 00:23:37.948 "name": "BaseBdev2", 00:23:37.948 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:37.948 "is_configured": true, 00:23:37.948 "data_offset": 256, 00:23:37.948 "data_size": 7936 00:23:37.948 } 00:23:37.948 ] 00:23:37.948 }' 00:23:37.948 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:37.948 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:37.948 12:00:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:37.948 12:00:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:37.948 12:00:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@716 -- # sleep 1 00:23:38.207 [2024-05-14 12:00:05.109704] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:38.207 [2024-05-14 
12:00:05.109759] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:38.207 [2024-05-14 12:00:05.109841] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.138 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:23:39.138 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:39.138 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:39.138 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:39.138 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:39.138 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:39.138 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.138 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.393 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:39.393 "name": "raid_bdev1", 00:23:39.393 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:39.394 "strip_size_kb": 0, 00:23:39.394 "state": "online", 00:23:39.394 "raid_level": "raid1", 00:23:39.394 "superblock": true, 00:23:39.394 "num_base_bdevs": 2, 00:23:39.394 "num_base_bdevs_discovered": 2, 00:23:39.394 "num_base_bdevs_operational": 2, 00:23:39.394 "base_bdevs_list": [ 00:23:39.394 { 00:23:39.394 "name": "spare", 00:23:39.394 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:39.394 "is_configured": true, 00:23:39.394 "data_offset": 256, 00:23:39.394 "data_size": 7936 00:23:39.394 }, 00:23:39.394 { 00:23:39.394 "name": 
"BaseBdev2", 00:23:39.394 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:39.394 "is_configured": true, 00:23:39.394 "data_offset": 256, 00:23:39.394 "data_size": 7936 00:23:39.394 } 00:23:39.394 ] 00:23:39.394 }' 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # break 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.394 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:39.650 "name": "raid_bdev1", 00:23:39.650 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:39.650 "strip_size_kb": 0, 00:23:39.650 "state": "online", 00:23:39.650 "raid_level": "raid1", 00:23:39.650 
"superblock": true, 00:23:39.650 "num_base_bdevs": 2, 00:23:39.650 "num_base_bdevs_discovered": 2, 00:23:39.650 "num_base_bdevs_operational": 2, 00:23:39.650 "base_bdevs_list": [ 00:23:39.650 { 00:23:39.650 "name": "spare", 00:23:39.650 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:39.650 "is_configured": true, 00:23:39.650 "data_offset": 256, 00:23:39.650 "data_size": 7936 00:23:39.650 }, 00:23:39.650 { 00:23:39.650 "name": "BaseBdev2", 00:23:39.650 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:39.650 "is_configured": true, 00:23:39.650 "data_offset": 256, 00:23:39.650 "data_size": 7936 00:23:39.650 } 00:23:39.650 ] 00:23:39.650 }' 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.650 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.907 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:39.907 "name": "raid_bdev1", 00:23:39.907 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:39.907 "strip_size_kb": 0, 00:23:39.907 "state": "online", 00:23:39.907 "raid_level": "raid1", 00:23:39.907 "superblock": true, 00:23:39.907 "num_base_bdevs": 2, 00:23:39.907 "num_base_bdevs_discovered": 2, 00:23:39.907 "num_base_bdevs_operational": 2, 00:23:39.907 "base_bdevs_list": [ 00:23:39.907 { 00:23:39.907 "name": "spare", 00:23:39.907 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:39.907 "is_configured": true, 00:23:39.907 "data_offset": 256, 00:23:39.907 "data_size": 7936 00:23:39.907 }, 00:23:39.907 { 00:23:39.907 "name": "BaseBdev2", 00:23:39.907 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:39.907 "is_configured": true, 00:23:39.907 "data_offset": 256, 00:23:39.907 "data_size": 7936 00:23:39.907 } 00:23:39.907 ] 00:23:39.907 }' 00:23:39.907 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:39.907 12:00:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:40.471 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:40.729 [2024-05-14 
12:00:07.762120] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:40.729 [2024-05-14 12:00:07.762149] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:40.729 [2024-05-14 12:00:07.762211] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:40.729 [2024-05-14 12:00:07.762267] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:40.729 [2024-05-14 12:00:07.762279] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b0ba50 name raid_bdev1, state offline 00:23:40.729 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.729 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@725 -- # jq length 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:40.987 
12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:40.987 12:00:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:41.245 /dev/nbd0 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@865 -- # local i 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # break 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:41.245 1+0 records in 00:23:41.245 1+0 records out 00:23:41.245 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239922 s, 17.1 MB/s 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # size=4096 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # return 0 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:41.245 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:41.502 /dev/nbd1 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@865 -- # local i 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # break 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i = 1 )) 
00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:41.502 1+0 records in 00:23:41.502 1+0 records out 00:23:41.502 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294727 s, 13.9 MB/s 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # size=4096 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # return 0 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:41.502 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@743 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:41.760 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:41.760 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:41.760 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:41.760 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:41.760 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 
-- # local i 00:23:41.760 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:41.760 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:42.018 12:00:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:42.275 12:00:09 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:23:42.275 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:42.534 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:42.534 [2024-05-14 12:00:09.606333] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:42.534 [2024-05-14 12:00:09.606381] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:42.534 [2024-05-14 12:00:09.606413] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b0eed0 00:23:42.534 [2024-05-14 12:00:09.606427] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:42.534 [2024-05-14 12:00:09.608028] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:42.534 [2024-05-14 12:00:09.608057] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:42.534 [2024-05-14 12:00:09.608120] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:42.534 [2024-05-14 12:00:09.608147] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:42.534 BaseBdev1 00:23:42.793 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:23:42.793 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev2 ']' 00:23:42.793 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:23:42.793 12:00:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:43.051 [2024-05-14 12:00:10.099664] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:43.051 [2024-05-14 12:00:10.099715] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:43.051 [2024-05-14 12:00:10.099738] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b0f1d0 00:23:43.051 [2024-05-14 12:00:10.099750] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:43.051 [2024-05-14 12:00:10.100089] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:43.051 [2024-05-14 12:00:10.100107] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:43.051 [2024-05-14 12:00:10.100168] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev2 00:23:43.051 [2024-05-14 12:00:10.100180] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing raid bdev raid_bdev1 (1) 00:23:43.051 [2024-05-14 12:00:10.100190] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:43.051 [2024-05-14 12:00:10.100205] bdev_raid.c: 
350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cba5d0 name raid_bdev1, state configuring 00:23:43.051 [2024-05-14 12:00:10.100235] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:43.051 BaseBdev2 00:23:43.051 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:43.308 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:43.566 [2024-05-14 12:00:10.592961] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:43.566 [2024-05-14 12:00:10.593004] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:43.566 [2024-05-14 12:00:10.593023] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cb97e0 00:23:43.566 [2024-05-14 12:00:10.593035] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:43.566 [2024-05-14 12:00:10.593384] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:43.566 [2024-05-14 12:00:10.593408] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:43.566 [2024-05-14 12:00:10.593484] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:23:43.566 [2024-05-14 12:00:10.593502] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:43.566 spare 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.566 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.824 [2024-05-14 12:00:10.693825] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b07a70 00:23:43.824 [2024-05-14 12:00:10.693841] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:43.824 [2024-05-14 12:00:10.694036] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b0b300 00:23:43.824 [2024-05-14 12:00:10.694189] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b07a70 00:23:43.824 [2024-05-14 12:00:10.694199] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b07a70 00:23:43.824 [2024-05-14 12:00:10.694307] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:43.824 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:23:43.824 "name": "raid_bdev1", 00:23:43.824 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:43.824 "strip_size_kb": 0, 00:23:43.824 "state": "online", 00:23:43.824 "raid_level": "raid1", 00:23:43.824 "superblock": true, 00:23:43.824 "num_base_bdevs": 2, 00:23:43.824 "num_base_bdevs_discovered": 2, 00:23:43.824 "num_base_bdevs_operational": 2, 00:23:43.824 "base_bdevs_list": [ 00:23:43.824 { 00:23:43.824 "name": "spare", 00:23:43.824 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:43.824 "is_configured": true, 00:23:43.824 "data_offset": 256, 00:23:43.824 "data_size": 7936 00:23:43.824 }, 00:23:43.824 { 00:23:43.824 "name": "BaseBdev2", 00:23:43.824 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:43.824 "is_configured": true, 00:23:43.824 "data_offset": 256, 00:23:43.824 "data_size": 7936 00:23:43.824 } 00:23:43.824 ] 00:23:43.824 }' 00:23:43.824 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:43.824 12:00:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:44.391 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:44.391 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:44.391 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:44.391 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:44.391 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:44.391 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.391 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:44.649 
12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:44.649 "name": "raid_bdev1", 00:23:44.650 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:44.650 "strip_size_kb": 0, 00:23:44.650 "state": "online", 00:23:44.650 "raid_level": "raid1", 00:23:44.650 "superblock": true, 00:23:44.650 "num_base_bdevs": 2, 00:23:44.650 "num_base_bdevs_discovered": 2, 00:23:44.650 "num_base_bdevs_operational": 2, 00:23:44.650 "base_bdevs_list": [ 00:23:44.650 { 00:23:44.650 "name": "spare", 00:23:44.650 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:44.650 "is_configured": true, 00:23:44.650 "data_offset": 256, 00:23:44.650 "data_size": 7936 00:23:44.650 }, 00:23:44.650 { 00:23:44.650 "name": "BaseBdev2", 00:23:44.650 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:44.650 "is_configured": true, 00:23:44.650 "data_offset": 256, 00:23:44.650 "data_size": 7936 00:23:44.650 } 00:23:44.650 ] 00:23:44.650 }' 00:23:44.650 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:44.908 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:44.908 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:44.908 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:44.908 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.908 12:00:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:45.167 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:23:45.167 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:45.424 [2024-05-14 12:00:12.273736] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.424 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.682 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:45.682 "name": "raid_bdev1", 00:23:45.682 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:45.682 "strip_size_kb": 0, 00:23:45.682 "state": "online", 00:23:45.682 "raid_level": "raid1", 00:23:45.682 "superblock": true, 00:23:45.682 "num_base_bdevs": 2, 
00:23:45.682 "num_base_bdevs_discovered": 1, 00:23:45.682 "num_base_bdevs_operational": 1, 00:23:45.682 "base_bdevs_list": [ 00:23:45.682 { 00:23:45.682 "name": null, 00:23:45.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:45.682 "is_configured": false, 00:23:45.682 "data_offset": 256, 00:23:45.682 "data_size": 7936 00:23:45.682 }, 00:23:45.682 { 00:23:45.682 "name": "BaseBdev2", 00:23:45.682 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:45.682 "is_configured": true, 00:23:45.682 "data_offset": 256, 00:23:45.682 "data_size": 7936 00:23:45.682 } 00:23:45.682 ] 00:23:45.682 }' 00:23:45.682 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:45.682 12:00:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:46.250 12:00:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:46.508 [2024-05-14 12:00:13.368671] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:46.509 [2024-05-14 12:00:13.368820] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:46.509 [2024-05-14 12:00:13.368837] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:46.509 [2024-05-14 12:00:13.368865] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:46.509 [2024-05-14 12:00:13.373655] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b082f0 00:23:46.509 [2024-05-14 12:00:13.375908] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:46.509 12:00:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # sleep 1 00:23:47.457 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:47.457 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:47.457 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:47.457 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:47.457 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:47.457 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.457 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.715 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:47.715 "name": "raid_bdev1", 00:23:47.715 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:47.715 "strip_size_kb": 0, 00:23:47.715 "state": "online", 00:23:47.715 "raid_level": "raid1", 00:23:47.715 "superblock": true, 00:23:47.715 "num_base_bdevs": 2, 00:23:47.715 "num_base_bdevs_discovered": 2, 00:23:47.715 "num_base_bdevs_operational": 2, 00:23:47.715 "process": { 00:23:47.715 "type": "rebuild", 00:23:47.715 "target": "spare", 00:23:47.715 "progress": { 00:23:47.715 "blocks": 3072, 
00:23:47.715 "percent": 38 00:23:47.715 } 00:23:47.715 }, 00:23:47.715 "base_bdevs_list": [ 00:23:47.715 { 00:23:47.715 "name": "spare", 00:23:47.715 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:47.715 "is_configured": true, 00:23:47.715 "data_offset": 256, 00:23:47.715 "data_size": 7936 00:23:47.715 }, 00:23:47.715 { 00:23:47.715 "name": "BaseBdev2", 00:23:47.715 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:47.715 "is_configured": true, 00:23:47.715 "data_offset": 256, 00:23:47.715 "data_size": 7936 00:23:47.715 } 00:23:47.715 ] 00:23:47.715 }' 00:23:47.715 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:47.715 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:47.715 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:47.715 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:47.715 12:00:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:47.973 [2024-05-14 12:00:14.962256] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:47.973 [2024-05-14 12:00:14.988651] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:47.973 [2024-05-14 12:00:14.988696] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:47.973 
12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.973 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.231 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:48.231 "name": "raid_bdev1", 00:23:48.231 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:48.231 "strip_size_kb": 0, 00:23:48.231 "state": "online", 00:23:48.231 "raid_level": "raid1", 00:23:48.231 "superblock": true, 00:23:48.231 "num_base_bdevs": 2, 00:23:48.231 "num_base_bdevs_discovered": 1, 00:23:48.231 "num_base_bdevs_operational": 1, 00:23:48.231 "base_bdevs_list": [ 00:23:48.231 { 00:23:48.231 "name": null, 00:23:48.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:48.231 "is_configured": false, 00:23:48.231 "data_offset": 256, 00:23:48.231 "data_size": 7936 00:23:48.231 }, 00:23:48.231 { 00:23:48.231 "name": "BaseBdev2", 00:23:48.231 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:48.232 "is_configured": true, 00:23:48.232 "data_offset": 256, 00:23:48.232 "data_size": 
7936 00:23:48.232 } 00:23:48.232 ] 00:23:48.232 }' 00:23:48.232 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:48.232 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:48.797 12:00:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:49.054 [2024-05-14 12:00:16.040537] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:49.054 [2024-05-14 12:00:16.040587] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:49.054 [2024-05-14 12:00:16.040610] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b0d040 00:23:49.054 [2024-05-14 12:00:16.040622] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:49.054 [2024-05-14 12:00:16.040993] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:49.054 [2024-05-14 12:00:16.041011] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:49.054 [2024-05-14 12:00:16.041090] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:23:49.054 [2024-05-14 12:00:16.041102] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:49.054 [2024-05-14 12:00:16.041113] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:49.054 [2024-05-14 12:00:16.041132] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:49.054 [2024-05-14 12:00:16.045954] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cb9ed0 00:23:49.054 spare 00:23:49.054 [2024-05-14 12:00:16.047413] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:49.054 12:00:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # sleep 1 00:23:49.990 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:49.990 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:49.990 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:49.990 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:49.990 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:49.990 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.990 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.247 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:50.247 "name": "raid_bdev1", 00:23:50.247 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:50.247 "strip_size_kb": 0, 00:23:50.247 "state": "online", 00:23:50.247 "raid_level": "raid1", 00:23:50.247 "superblock": true, 00:23:50.247 "num_base_bdevs": 2, 00:23:50.247 "num_base_bdevs_discovered": 2, 00:23:50.247 "num_base_bdevs_operational": 2, 00:23:50.247 "process": { 00:23:50.247 "type": "rebuild", 00:23:50.247 "target": "spare", 00:23:50.247 "progress": { 00:23:50.247 
"blocks": 3072, 00:23:50.247 "percent": 38 00:23:50.247 } 00:23:50.247 }, 00:23:50.247 "base_bdevs_list": [ 00:23:50.247 { 00:23:50.247 "name": "spare", 00:23:50.247 "uuid": "2ab9c963-4d44-588a-b49a-fa56b92d24d2", 00:23:50.247 "is_configured": true, 00:23:50.247 "data_offset": 256, 00:23:50.247 "data_size": 7936 00:23:50.247 }, 00:23:50.247 { 00:23:50.247 "name": "BaseBdev2", 00:23:50.247 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:50.247 "is_configured": true, 00:23:50.247 "data_offset": 256, 00:23:50.247 "data_size": 7936 00:23:50.247 } 00:23:50.247 ] 00:23:50.247 }' 00:23:50.247 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:50.506 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:50.506 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:50.506 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:50.506 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:50.765 [2024-05-14 12:00:17.626578] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:50.765 [2024-05-14 12:00:17.660100] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:50.765 [2024-05-14 12:00:17.660146] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 
00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.765 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.025 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:51.025 "name": "raid_bdev1", 00:23:51.025 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:51.025 "strip_size_kb": 0, 00:23:51.025 "state": "online", 00:23:51.025 "raid_level": "raid1", 00:23:51.025 "superblock": true, 00:23:51.025 "num_base_bdevs": 2, 00:23:51.025 "num_base_bdevs_discovered": 1, 00:23:51.025 "num_base_bdevs_operational": 1, 00:23:51.025 "base_bdevs_list": [ 00:23:51.025 { 00:23:51.025 "name": null, 00:23:51.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.025 "is_configured": false, 00:23:51.025 "data_offset": 256, 00:23:51.025 "data_size": 7936 00:23:51.025 }, 00:23:51.025 { 00:23:51.025 "name": "BaseBdev2", 00:23:51.025 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:51.025 "is_configured": true, 00:23:51.025 "data_offset": 256, 00:23:51.025 
"data_size": 7936 00:23:51.025 } 00:23:51.025 ] 00:23:51.025 }' 00:23:51.025 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:51.025 12:00:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:51.592 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:51.592 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:51.592 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:51.592 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:51.592 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:51.592 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.592 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.851 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:51.851 "name": "raid_bdev1", 00:23:51.851 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:51.851 "strip_size_kb": 0, 00:23:51.851 "state": "online", 00:23:51.851 "raid_level": "raid1", 00:23:51.851 "superblock": true, 00:23:51.851 "num_base_bdevs": 2, 00:23:51.851 "num_base_bdevs_discovered": 1, 00:23:51.851 "num_base_bdevs_operational": 1, 00:23:51.851 "base_bdevs_list": [ 00:23:51.851 { 00:23:51.851 "name": null, 00:23:51.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.851 "is_configured": false, 00:23:51.851 "data_offset": 256, 00:23:51.851 "data_size": 7936 00:23:51.851 }, 00:23:51.851 { 00:23:51.851 "name": "BaseBdev2", 00:23:51.851 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 
00:23:51.851 "is_configured": true, 00:23:51.851 "data_offset": 256, 00:23:51.851 "data_size": 7936 00:23:51.851 } 00:23:51.851 ] 00:23:51.851 }' 00:23:51.851 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:51.851 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:51.851 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:51.851 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:51.851 12:00:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:52.111 12:00:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:52.370 [2024-05-14 12:00:19.317642] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:52.370 [2024-05-14 12:00:19.317690] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:52.370 [2024-05-14 12:00:19.317710] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cbadd0 00:23:52.370 [2024-05-14 12:00:19.317723] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:52.370 [2024-05-14 12:00:19.318072] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:52.370 [2024-05-14 12:00:19.318089] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:52.370 [2024-05-14 12:00:19.318151] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:52.370 [2024-05-14 12:00:19.318163] bdev_raid.c:3411:raid_bdev_examine_sb: 
*DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:52.370 [2024-05-14 12:00:19.318174] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:52.370 BaseBdev1 00:23:52.370 12:00:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@786 -- # sleep 1 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:53.307 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.308 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.567 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:53.567 "name": "raid_bdev1", 00:23:53.567 "uuid": 
"f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:53.567 "strip_size_kb": 0, 00:23:53.567 "state": "online", 00:23:53.567 "raid_level": "raid1", 00:23:53.567 "superblock": true, 00:23:53.567 "num_base_bdevs": 2, 00:23:53.567 "num_base_bdevs_discovered": 1, 00:23:53.567 "num_base_bdevs_operational": 1, 00:23:53.567 "base_bdevs_list": [ 00:23:53.567 { 00:23:53.567 "name": null, 00:23:53.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:53.567 "is_configured": false, 00:23:53.567 "data_offset": 256, 00:23:53.567 "data_size": 7936 00:23:53.567 }, 00:23:53.567 { 00:23:53.567 "name": "BaseBdev2", 00:23:53.567 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:53.567 "is_configured": true, 00:23:53.567 "data_offset": 256, 00:23:53.567 "data_size": 7936 00:23:53.567 } 00:23:53.567 ] 00:23:53.567 }' 00:23:53.567 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:53.567 12:00:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:54.153 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:54.153 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:54.153 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:54.153 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:54.153 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:54.153 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.153 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.411 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # 
raid_bdev_info='{ 00:23:54.411 "name": "raid_bdev1", 00:23:54.411 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:54.411 "strip_size_kb": 0, 00:23:54.411 "state": "online", 00:23:54.411 "raid_level": "raid1", 00:23:54.411 "superblock": true, 00:23:54.411 "num_base_bdevs": 2, 00:23:54.411 "num_base_bdevs_discovered": 1, 00:23:54.411 "num_base_bdevs_operational": 1, 00:23:54.411 "base_bdevs_list": [ 00:23:54.411 { 00:23:54.411 "name": null, 00:23:54.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:54.411 "is_configured": false, 00:23:54.411 "data_offset": 256, 00:23:54.411 "data_size": 7936 00:23:54.411 }, 00:23:54.411 { 00:23:54.411 "name": "BaseBdev2", 00:23:54.411 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:54.411 "is_configured": true, 00:23:54.411 "data_offset": 256, 00:23:54.411 "data_size": 7936 00:23:54.411 } 00:23:54.411 ] 00:23:54.411 }' 00:23:54.411 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:54.412 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:54.412 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:54.412 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:54.412 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:54.412 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:54.671 [2024-05-14 12:00:21.667883] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:54.671 [2024-05-14 12:00:21.668014] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:54.671 [2024-05-14 12:00:21.668029] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:54.671 request: 00:23:54.671 { 00:23:54.671 "raid_bdev": "raid_bdev1", 00:23:54.671 "base_bdev": "BaseBdev1", 00:23:54.671 "method": "bdev_raid_add_base_bdev", 00:23:54.671 "req_id": 
1 00:23:54.671 } 00:23:54.671 Got JSON-RPC error response 00:23:54.671 response: 00:23:54.671 { 00:23:54.671 "code": -22, 00:23:54.671 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:54.671 } 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:54.671 12:00:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@790 -- # sleep 1 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.049 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:56.049 "name": "raid_bdev1", 00:23:56.049 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:56.049 "strip_size_kb": 0, 00:23:56.049 "state": "online", 00:23:56.049 "raid_level": "raid1", 00:23:56.049 "superblock": true, 00:23:56.049 "num_base_bdevs": 2, 00:23:56.049 "num_base_bdevs_discovered": 1, 00:23:56.049 "num_base_bdevs_operational": 1, 00:23:56.049 "base_bdevs_list": [ 00:23:56.049 { 00:23:56.049 "name": null, 00:23:56.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.049 "is_configured": false, 00:23:56.049 "data_offset": 256, 00:23:56.049 "data_size": 7936 00:23:56.049 }, 00:23:56.050 { 00:23:56.050 "name": "BaseBdev2", 00:23:56.050 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:56.050 "is_configured": true, 00:23:56.050 "data_offset": 256, 00:23:56.050 "data_size": 7936 00:23:56.050 } 00:23:56.050 ] 00:23:56.050 }' 00:23:56.050 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:56.050 12:00:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:56.624 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:56.624 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:56.624 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:56.624 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:56.624 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:56.624 
12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.624 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:56.884 "name": "raid_bdev1", 00:23:56.884 "uuid": "f4a6f457-c148-4ebe-b0e1-b76c86142e51", 00:23:56.884 "strip_size_kb": 0, 00:23:56.884 "state": "online", 00:23:56.884 "raid_level": "raid1", 00:23:56.884 "superblock": true, 00:23:56.884 "num_base_bdevs": 2, 00:23:56.884 "num_base_bdevs_discovered": 1, 00:23:56.884 "num_base_bdevs_operational": 1, 00:23:56.884 "base_bdevs_list": [ 00:23:56.884 { 00:23:56.884 "name": null, 00:23:56.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.884 "is_configured": false, 00:23:56.884 "data_offset": 256, 00:23:56.884 "data_size": 7936 00:23:56.884 }, 00:23:56.884 { 00:23:56.884 "name": "BaseBdev2", 00:23:56.884 "uuid": "4b7fc474-7d82-59e4-b0e1-06738b2ceab2", 00:23:56.884 "is_configured": true, 00:23:56.884 "data_offset": 256, 00:23:56.884 "data_size": 7936 00:23:56.884 } 00:23:56.884 ] 00:23:56.884 }' 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@795 -- # killprocess 1785952 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@946 -- # '[' -z 1785952 ']' 00:23:56.884 12:00:23 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@950 -- # kill -0 1785952 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@951 -- # uname 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1785952 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1785952' 00:23:56.884 killing process with pid 1785952 00:23:56.884 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@965 -- # kill 1785952 00:23:56.885 Received shutdown signal, test time was about 60.000000 seconds 00:23:56.885 00:23:56.885 Latency(us) 00:23:56.885 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:56.885 =================================================================================================================== 00:23:56.885 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:56.885 [2024-05-14 12:00:23.941654] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:56.885 [2024-05-14 12:00:23.941749] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:56.885 [2024-05-14 12:00:23.941792] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:56.885 [2024-05-14 12:00:23.941804] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b07a70 name raid_bdev1, state offline 00:23:56.885 12:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@970 -- # wait 1785952 
00:23:56.885 [2024-05-14 12:00:23.968482] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:57.143 12:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@797 -- # return 0 00:23:57.143 00:23:57.143 real 0m32.052s 00:23:57.143 user 0m50.151s 00:23:57.143 sys 0m5.230s 00:23:57.143 12:00:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:57.143 12:00:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:57.143 ************************************ 00:23:57.143 END TEST raid_rebuild_test_sb_4k 00:23:57.143 ************************************ 00:23:57.143 12:00:24 bdev_raid -- bdev/bdev_raid.sh@850 -- # base_malloc_params='-m 32' 00:23:57.143 12:00:24 bdev_raid -- bdev/bdev_raid.sh@851 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:23:57.143 12:00:24 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:23:57.143 12:00:24 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:57.143 12:00:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:57.402 ************************************ 00:23:57.402 START TEST raid_state_function_test_sb_md_separate 00:23:57.402 ************************************ 00:23:57.402 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:23:57.402 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:23:57.402 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:23:57.402 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:23:57.402 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:23:57.403 12:00:24 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:23:57.403 12:00:24 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # raid_pid=1790978 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1790978' 00:23:57.403 Process raid pid: 1790978 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@247 -- # waitforlisten 1790978 /var/tmp/spdk-raid.sock 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@827 -- # '[' -z 1790978 ']' 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:57.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:57.403 12:00:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:57.403 [2024-05-14 12:00:24.320857] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:23:57.403 [2024-05-14 12:00:24.320924] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:57.403 [2024-05-14 12:00:24.450540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:57.669 [2024-05-14 12:00:24.552565] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:57.669 [2024-05-14 12:00:24.622061] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:57.669 [2024-05-14 12:00:24.622098] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:58.240 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:58.240 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@860 -- # return 0 00:23:58.240 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:58.500 [2024-05-14 12:00:25.384751] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:58.500 [2024-05-14 12:00:25.384793] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:58.500 [2024-05-14 12:00:25.384804] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:58.500 [2024-05-14 12:00:25.384817] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=Existed_Raid 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.500 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:58.759 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:58.759 "name": "Existed_Raid", 00:23:58.759 "uuid": "62033179-4005-4228-b415-add0a9b46cbf", 00:23:58.759 "strip_size_kb": 0, 00:23:58.759 "state": "configuring", 00:23:58.759 "raid_level": "raid1", 00:23:58.759 "superblock": true, 00:23:58.759 "num_base_bdevs": 2, 00:23:58.759 "num_base_bdevs_discovered": 0, 00:23:58.759 "num_base_bdevs_operational": 2, 00:23:58.759 "base_bdevs_list": [ 00:23:58.759 { 00:23:58.759 "name": "BaseBdev1", 00:23:58.759 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:58.759 "is_configured": false, 00:23:58.759 "data_offset": 0, 00:23:58.759 "data_size": 0 00:23:58.759 }, 00:23:58.759 { 00:23:58.759 "name": "BaseBdev2", 00:23:58.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:58.759 "is_configured": false, 00:23:58.759 "data_offset": 0, 00:23:58.759 "data_size": 0 00:23:58.759 } 00:23:58.759 ] 00:23:58.759 }' 00:23:58.759 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:58.759 12:00:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:59.329 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:59.587 [2024-05-14 12:00:26.455451] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:59.587 [2024-05-14 12:00:26.455485] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa3700 name Existed_Raid, state configuring 00:23:59.587 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:59.846 [2024-05-14 12:00:26.696102] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:59.846 [2024-05-14 12:00:26.696136] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:59.846 [2024-05-14 12:00:26.696146] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:59.846 [2024-05-14 12:00:26.696158] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:59.846 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:24:00.116 [2024-05-14 12:00:26.948559] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:00.116 BaseBdev1 00:24:00.116 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:24:00.116 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:24:00.116 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:24:00.116 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local i 00:24:00.116 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:24:00.116 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:24:00.116 12:00:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:00.379 [ 00:24:00.379 { 00:24:00.379 "name": "BaseBdev1", 00:24:00.379 "aliases": [ 00:24:00.379 "92c59950-46f2-4fb9-8acb-cdf1178f7033" 00:24:00.379 ], 00:24:00.379 "product_name": "Malloc disk", 00:24:00.379 "block_size": 4096, 00:24:00.379 "num_blocks": 8192, 00:24:00.379 "uuid": "92c59950-46f2-4fb9-8acb-cdf1178f7033", 00:24:00.379 "md_size": 32, 00:24:00.379 "md_interleave": false, 00:24:00.379 "dif_type": 0, 00:24:00.379 "assigned_rate_limits": { 
00:24:00.379 "rw_ios_per_sec": 0, 00:24:00.379 "rw_mbytes_per_sec": 0, 00:24:00.379 "r_mbytes_per_sec": 0, 00:24:00.379 "w_mbytes_per_sec": 0 00:24:00.379 }, 00:24:00.379 "claimed": true, 00:24:00.379 "claim_type": "exclusive_write", 00:24:00.379 "zoned": false, 00:24:00.379 "supported_io_types": { 00:24:00.379 "read": true, 00:24:00.379 "write": true, 00:24:00.379 "unmap": true, 00:24:00.379 "write_zeroes": true, 00:24:00.379 "flush": true, 00:24:00.379 "reset": true, 00:24:00.379 "compare": false, 00:24:00.379 "compare_and_write": false, 00:24:00.379 "abort": true, 00:24:00.379 "nvme_admin": false, 00:24:00.379 "nvme_io": false 00:24:00.379 }, 00:24:00.379 "memory_domains": [ 00:24:00.379 { 00:24:00.379 "dma_device_id": "system", 00:24:00.379 "dma_device_type": 1 00:24:00.379 }, 00:24:00.379 { 00:24:00.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:00.379 "dma_device_type": 2 00:24:00.379 } 00:24:00.379 ], 00:24:00.379 "driver_specific": {} 00:24:00.379 } 00:24:00.379 ] 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # return 0 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.379 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:00.638 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:00.638 "name": "Existed_Raid", 00:24:00.638 "uuid": "eb2c9f2f-2632-4161-9250-e04748ae3f86", 00:24:00.638 "strip_size_kb": 0, 00:24:00.638 "state": "configuring", 00:24:00.638 "raid_level": "raid1", 00:24:00.638 "superblock": true, 00:24:00.638 "num_base_bdevs": 2, 00:24:00.638 "num_base_bdevs_discovered": 1, 00:24:00.638 "num_base_bdevs_operational": 2, 00:24:00.638 "base_bdevs_list": [ 00:24:00.638 { 00:24:00.638 "name": "BaseBdev1", 00:24:00.638 "uuid": "92c59950-46f2-4fb9-8acb-cdf1178f7033", 00:24:00.638 "is_configured": true, 00:24:00.638 "data_offset": 256, 00:24:00.638 "data_size": 7936 00:24:00.638 }, 00:24:00.638 { 00:24:00.638 "name": "BaseBdev2", 00:24:00.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:00.638 "is_configured": false, 00:24:00.638 "data_offset": 0, 00:24:00.638 "data_size": 0 00:24:00.638 } 00:24:00.638 ] 00:24:00.638 }' 00:24:00.638 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:00.638 12:00:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- 
# set +x 00:24:01.205 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:01.462 [2024-05-14 12:00:28.480641] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:01.462 [2024-05-14 12:00:28.480684] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa39a0 name Existed_Raid, state configuring 00:24:01.462 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:01.720 [2024-05-14 12:00:28.721313] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:01.720 [2024-05-14 12:00:28.722878] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:01.720 [2024-05-14 12:00:28.722911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:01.720 12:00:28 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.720 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:01.978 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:01.979 "name": "Existed_Raid", 00:24:01.979 "uuid": "4cb21dac-26cf-4814-a687-51551d686266", 00:24:01.979 "strip_size_kb": 0, 00:24:01.979 "state": "configuring", 00:24:01.979 "raid_level": "raid1", 00:24:01.979 "superblock": true, 00:24:01.979 "num_base_bdevs": 2, 00:24:01.979 "num_base_bdevs_discovered": 1, 00:24:01.979 "num_base_bdevs_operational": 2, 00:24:01.979 "base_bdevs_list": [ 00:24:01.979 { 00:24:01.979 "name": "BaseBdev1", 00:24:01.979 "uuid": "92c59950-46f2-4fb9-8acb-cdf1178f7033", 00:24:01.979 "is_configured": true, 00:24:01.979 "data_offset": 256, 00:24:01.979 "data_size": 7936 00:24:01.979 }, 00:24:01.979 { 00:24:01.979 "name": "BaseBdev2", 00:24:01.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.979 "is_configured": false, 00:24:01.979 
"data_offset": 0, 00:24:01.979 "data_size": 0 00:24:01.979 } 00:24:01.979 ] 00:24:01.979 }' 00:24:01.979 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:01.979 12:00:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:02.547 12:00:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:24:02.808 [2024-05-14 12:00:29.820293] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:02.808 [2024-05-14 12:00:29.820444] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xaa2ff0 00:24:02.808 [2024-05-14 12:00:29.820459] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:02.808 [2024-05-14 12:00:29.820521] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaa3ce0 00:24:02.808 [2024-05-14 12:00:29.820617] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaa2ff0 00:24:02.808 [2024-05-14 12:00:29.820627] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xaa2ff0 00:24:02.808 [2024-05-14 12:00:29.820692] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:02.808 BaseBdev2 00:24:02.808 12:00:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:24:02.808 12:00:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:24:02.808 12:00:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:24:02.809 12:00:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local i 00:24:02.809 12:00:29 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:24:02.809 12:00:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:24:02.809 12:00:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:03.068 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:03.327 [ 00:24:03.327 { 00:24:03.327 "name": "BaseBdev2", 00:24:03.327 "aliases": [ 00:24:03.327 "f487bd28-4513-4898-bbf8-ee7ee2f612d8" 00:24:03.327 ], 00:24:03.327 "product_name": "Malloc disk", 00:24:03.327 "block_size": 4096, 00:24:03.327 "num_blocks": 8192, 00:24:03.327 "uuid": "f487bd28-4513-4898-bbf8-ee7ee2f612d8", 00:24:03.327 "md_size": 32, 00:24:03.327 "md_interleave": false, 00:24:03.327 "dif_type": 0, 00:24:03.327 "assigned_rate_limits": { 00:24:03.327 "rw_ios_per_sec": 0, 00:24:03.327 "rw_mbytes_per_sec": 0, 00:24:03.327 "r_mbytes_per_sec": 0, 00:24:03.327 "w_mbytes_per_sec": 0 00:24:03.327 }, 00:24:03.327 "claimed": true, 00:24:03.327 "claim_type": "exclusive_write", 00:24:03.327 "zoned": false, 00:24:03.327 "supported_io_types": { 00:24:03.327 "read": true, 00:24:03.327 "write": true, 00:24:03.327 "unmap": true, 00:24:03.327 "write_zeroes": true, 00:24:03.327 "flush": true, 00:24:03.327 "reset": true, 00:24:03.327 "compare": false, 00:24:03.327 "compare_and_write": false, 00:24:03.327 "abort": true, 00:24:03.327 "nvme_admin": false, 00:24:03.327 "nvme_io": false 00:24:03.327 }, 00:24:03.327 "memory_domains": [ 00:24:03.327 { 00:24:03.327 "dma_device_id": "system", 00:24:03.327 "dma_device_type": 1 00:24:03.327 }, 00:24:03.327 { 00:24:03.327 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:24:03.327 "dma_device_type": 2 00:24:03.327 } 00:24:03.327 ], 00:24:03.327 "driver_specific": {} 00:24:03.327 } 00:24:03.327 ] 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # return 0 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.327 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:03.585 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:03.585 "name": "Existed_Raid", 00:24:03.585 "uuid": "4cb21dac-26cf-4814-a687-51551d686266", 00:24:03.585 "strip_size_kb": 0, 00:24:03.585 "state": "online", 00:24:03.585 "raid_level": "raid1", 00:24:03.585 "superblock": true, 00:24:03.585 "num_base_bdevs": 2, 00:24:03.585 "num_base_bdevs_discovered": 2, 00:24:03.585 "num_base_bdevs_operational": 2, 00:24:03.585 "base_bdevs_list": [ 00:24:03.585 { 00:24:03.585 "name": "BaseBdev1", 00:24:03.585 "uuid": "92c59950-46f2-4fb9-8acb-cdf1178f7033", 00:24:03.585 "is_configured": true, 00:24:03.585 "data_offset": 256, 00:24:03.585 "data_size": 7936 00:24:03.585 }, 00:24:03.585 { 00:24:03.585 "name": "BaseBdev2", 00:24:03.585 "uuid": "f487bd28-4513-4898-bbf8-ee7ee2f612d8", 00:24:03.585 "is_configured": true, 00:24:03.585 "data_offset": 256, 00:24:03.585 "data_size": 7936 00:24:03.585 } 00:24:03.585 ] 00:24:03.585 }' 00:24:03.585 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:03.585 12:00:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:04.150 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:24:04.150 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:24:04.150 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:24:04.150 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:24:04.150 12:00:31 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:24:04.150 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@199 -- # local name 00:24:04.150 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:24:04.150 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:04.408 [2024-05-14 12:00:31.264408] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:04.408 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:24:04.408 "name": "Existed_Raid", 00:24:04.408 "aliases": [ 00:24:04.408 "4cb21dac-26cf-4814-a687-51551d686266" 00:24:04.408 ], 00:24:04.408 "product_name": "Raid Volume", 00:24:04.408 "block_size": 4096, 00:24:04.408 "num_blocks": 7936, 00:24:04.408 "uuid": "4cb21dac-26cf-4814-a687-51551d686266", 00:24:04.408 "md_size": 32, 00:24:04.408 "md_interleave": false, 00:24:04.408 "dif_type": 0, 00:24:04.408 "assigned_rate_limits": { 00:24:04.408 "rw_ios_per_sec": 0, 00:24:04.408 "rw_mbytes_per_sec": 0, 00:24:04.408 "r_mbytes_per_sec": 0, 00:24:04.408 "w_mbytes_per_sec": 0 00:24:04.408 }, 00:24:04.408 "claimed": false, 00:24:04.408 "zoned": false, 00:24:04.408 "supported_io_types": { 00:24:04.408 "read": true, 00:24:04.408 "write": true, 00:24:04.408 "unmap": false, 00:24:04.408 "write_zeroes": true, 00:24:04.408 "flush": false, 00:24:04.408 "reset": true, 00:24:04.409 "compare": false, 00:24:04.409 "compare_and_write": false, 00:24:04.409 "abort": false, 00:24:04.409 "nvme_admin": false, 00:24:04.409 "nvme_io": false 00:24:04.409 }, 00:24:04.409 "memory_domains": [ 00:24:04.409 { 00:24:04.409 "dma_device_id": "system", 00:24:04.409 "dma_device_type": 1 00:24:04.409 }, 00:24:04.409 { 
00:24:04.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:04.409 "dma_device_type": 2 00:24:04.409 }, 00:24:04.409 { 00:24:04.409 "dma_device_id": "system", 00:24:04.409 "dma_device_type": 1 00:24:04.409 }, 00:24:04.409 { 00:24:04.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:04.409 "dma_device_type": 2 00:24:04.409 } 00:24:04.409 ], 00:24:04.409 "driver_specific": { 00:24:04.409 "raid": { 00:24:04.409 "uuid": "4cb21dac-26cf-4814-a687-51551d686266", 00:24:04.409 "strip_size_kb": 0, 00:24:04.409 "state": "online", 00:24:04.409 "raid_level": "raid1", 00:24:04.409 "superblock": true, 00:24:04.409 "num_base_bdevs": 2, 00:24:04.409 "num_base_bdevs_discovered": 2, 00:24:04.409 "num_base_bdevs_operational": 2, 00:24:04.409 "base_bdevs_list": [ 00:24:04.409 { 00:24:04.409 "name": "BaseBdev1", 00:24:04.409 "uuid": "92c59950-46f2-4fb9-8acb-cdf1178f7033", 00:24:04.409 "is_configured": true, 00:24:04.409 "data_offset": 256, 00:24:04.409 "data_size": 7936 00:24:04.409 }, 00:24:04.409 { 00:24:04.409 "name": "BaseBdev2", 00:24:04.409 "uuid": "f487bd28-4513-4898-bbf8-ee7ee2f612d8", 00:24:04.409 "is_configured": true, 00:24:04.409 "data_offset": 256, 00:24:04.409 "data_size": 7936 00:24:04.409 } 00:24:04.409 ] 00:24:04.409 } 00:24:04.409 } 00:24:04.409 }' 00:24:04.409 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:04.409 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:24:04.409 BaseBdev2' 00:24:04.409 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:04.409 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:04.409 12:00:31 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:04.667 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:04.667 "name": "BaseBdev1", 00:24:04.667 "aliases": [ 00:24:04.667 "92c59950-46f2-4fb9-8acb-cdf1178f7033" 00:24:04.667 ], 00:24:04.667 "product_name": "Malloc disk", 00:24:04.667 "block_size": 4096, 00:24:04.667 "num_blocks": 8192, 00:24:04.667 "uuid": "92c59950-46f2-4fb9-8acb-cdf1178f7033", 00:24:04.667 "md_size": 32, 00:24:04.667 "md_interleave": false, 00:24:04.667 "dif_type": 0, 00:24:04.667 "assigned_rate_limits": { 00:24:04.667 "rw_ios_per_sec": 0, 00:24:04.667 "rw_mbytes_per_sec": 0, 00:24:04.667 "r_mbytes_per_sec": 0, 00:24:04.667 "w_mbytes_per_sec": 0 00:24:04.667 }, 00:24:04.667 "claimed": true, 00:24:04.667 "claim_type": "exclusive_write", 00:24:04.667 "zoned": false, 00:24:04.667 "supported_io_types": { 00:24:04.667 "read": true, 00:24:04.667 "write": true, 00:24:04.667 "unmap": true, 00:24:04.667 "write_zeroes": true, 00:24:04.667 "flush": true, 00:24:04.667 "reset": true, 00:24:04.667 "compare": false, 00:24:04.667 "compare_and_write": false, 00:24:04.667 "abort": true, 00:24:04.667 "nvme_admin": false, 00:24:04.667 "nvme_io": false 00:24:04.667 }, 00:24:04.667 "memory_domains": [ 00:24:04.667 { 00:24:04.667 "dma_device_id": "system", 00:24:04.667 "dma_device_type": 1 00:24:04.667 }, 00:24:04.667 { 00:24:04.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:04.667 "dma_device_type": 2 00:24:04.667 } 00:24:04.667 ], 00:24:04.667 "driver_specific": {} 00:24:04.667 }' 00:24:04.667 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:04.667 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:04.667 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:04.667 12:00:31 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:04.667 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:04.925 12:00:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:05.183 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:05.183 "name": "BaseBdev2", 00:24:05.183 "aliases": [ 00:24:05.183 "f487bd28-4513-4898-bbf8-ee7ee2f612d8" 00:24:05.183 ], 00:24:05.183 "product_name": "Malloc disk", 00:24:05.183 "block_size": 4096, 00:24:05.183 "num_blocks": 8192, 00:24:05.183 "uuid": "f487bd28-4513-4898-bbf8-ee7ee2f612d8", 00:24:05.183 "md_size": 32, 00:24:05.183 "md_interleave": false, 00:24:05.183 
"dif_type": 0, 00:24:05.183 "assigned_rate_limits": { 00:24:05.183 "rw_ios_per_sec": 0, 00:24:05.183 "rw_mbytes_per_sec": 0, 00:24:05.183 "r_mbytes_per_sec": 0, 00:24:05.183 "w_mbytes_per_sec": 0 00:24:05.183 }, 00:24:05.183 "claimed": true, 00:24:05.183 "claim_type": "exclusive_write", 00:24:05.183 "zoned": false, 00:24:05.183 "supported_io_types": { 00:24:05.183 "read": true, 00:24:05.183 "write": true, 00:24:05.183 "unmap": true, 00:24:05.183 "write_zeroes": true, 00:24:05.183 "flush": true, 00:24:05.183 "reset": true, 00:24:05.183 "compare": false, 00:24:05.183 "compare_and_write": false, 00:24:05.183 "abort": true, 00:24:05.183 "nvme_admin": false, 00:24:05.183 "nvme_io": false 00:24:05.183 }, 00:24:05.183 "memory_domains": [ 00:24:05.183 { 00:24:05.183 "dma_device_id": "system", 00:24:05.183 "dma_device_type": 1 00:24:05.183 }, 00:24:05.183 { 00:24:05.183 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:05.183 "dma_device_type": 2 00:24:05.183 } 00:24:05.183 ], 00:24:05.184 "driver_specific": {} 00:24:05.184 }' 00:24:05.184 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:05.184 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:05.184 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:05.184 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:05.440 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:05.441 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:05.441 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:05.441 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:05.441 12:00:32 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:05.441 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:05.441 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:05.697 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:05.697 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:05.697 [2024-05-14 12:00:32.752153] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:05.697 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # local expected_state 00:24:05.697 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # case $1 in 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@215 -- # return 0 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.955 12:00:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:05.955 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:05.955 "name": "Existed_Raid", 00:24:05.955 "uuid": "4cb21dac-26cf-4814-a687-51551d686266", 00:24:05.955 "strip_size_kb": 0, 00:24:05.955 "state": "online", 00:24:05.956 "raid_level": "raid1", 00:24:05.956 "superblock": true, 00:24:05.956 "num_base_bdevs": 2, 00:24:05.956 "num_base_bdevs_discovered": 1, 00:24:05.956 "num_base_bdevs_operational": 1, 00:24:05.956 "base_bdevs_list": [ 00:24:05.956 { 00:24:05.956 "name": null, 00:24:05.956 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.956 "is_configured": false, 00:24:05.956 "data_offset": 256, 00:24:05.956 "data_size": 7936 00:24:05.956 }, 00:24:05.956 { 00:24:05.956 "name": "BaseBdev2", 00:24:05.956 "uuid": "f487bd28-4513-4898-bbf8-ee7ee2f612d8", 00:24:05.956 "is_configured": true, 00:24:05.956 "data_offset": 256, 00:24:05.956 "data_size": 7936 00:24:05.956 } 
00:24:05.956 ] 00:24:05.956 }' 00:24:05.956 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:05.956 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:06.890 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:24:06.890 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:24:06.890 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.890 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:24:06.890 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:24:06.890 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:06.890 12:00:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:24:07.148 [2024-05-14 12:00:34.072255] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:07.148 [2024-05-14 12:00:34.072338] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:07.148 [2024-05-14 12:00:34.083870] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:07.148 [2024-05-14 12:00:34.083936] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:07.148 [2024-05-14 12:00:34.083949] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa2ff0 name Existed_Raid, state offline 
00:24:07.149 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:24:07.149 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:24:07.149 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.149 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@342 -- # killprocess 1790978 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@946 -- # '[' -z 1790978 ']' 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@950 -- # kill -0 1790978 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@951 -- # uname 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1790978 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:07.407 12:00:34 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1790978' 00:24:07.407 killing process with pid 1790978 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@965 -- # kill 1790978 00:24:07.407 [2024-05-14 12:00:34.404612] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:07.407 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@970 -- # wait 1790978 00:24:07.407 [2024-05-14 12:00:34.405481] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:07.666 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@344 -- # return 0 00:24:07.666 00:24:07.666 real 0m10.348s 00:24:07.666 user 0m18.312s 00:24:07.666 sys 0m1.996s 00:24:07.666 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:07.666 12:00:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:07.666 ************************************ 00:24:07.666 END TEST raid_state_function_test_sb_md_separate 00:24:07.666 ************************************ 00:24:07.666 12:00:34 bdev_raid -- bdev/bdev_raid.sh@852 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:24:07.666 12:00:34 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:24:07.666 12:00:34 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:07.666 12:00:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:07.666 ************************************ 00:24:07.666 START TEST raid_superblock_test_md_separate 00:24:07.666 ************************************ 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # 
raid_pid=1792591 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@413 -- # waitforlisten 1792591 /var/tmp/spdk-raid.sock 00:24:07.666 12:00:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@827 -- # '[' -z 1792591 ']' 00:24:07.667 12:00:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:07.667 12:00:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:07.667 12:00:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:07.667 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:07.667 12:00:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:07.667 12:00:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:07.667 [2024-05-14 12:00:34.731782] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:24:07.667 [2024-05-14 12:00:34.731823] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1792591 ] 00:24:07.926 [2024-05-14 12:00:34.842578] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:07.926 [2024-05-14 12:00:34.950523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:08.185 [2024-05-14 12:00:35.017391] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:08.185 [2024-05-14 12:00:35.017430] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@860 -- # return 0 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:24:08.751 12:00:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:08.751 12:00:35 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:24:09.009 malloc1 00:24:09.009 12:00:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:09.267 [2024-05-14 12:00:36.148722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:09.267 [2024-05-14 12:00:36.148771] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:09.267 [2024-05-14 12:00:36.148793] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb45df0 00:24:09.267 [2024-05-14 12:00:36.148805] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:09.267 [2024-05-14 12:00:36.150284] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:09.267 [2024-05-14 12:00:36.150310] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:09.267 pt1 00:24:09.267 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:24:09.267 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:24:09.267 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:24:09.267 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:24:09.267 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:24:09.267 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:09.267 12:00:36 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:24:09.267 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:09.267 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:24:09.525 malloc2 00:24:09.525 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:09.784 [2024-05-14 12:00:36.627470] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:09.784 [2024-05-14 12:00:36.627513] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:09.784 [2024-05-14 12:00:36.627532] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc9ca90 00:24:09.784 [2024-05-14 12:00:36.627545] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:09.784 [2024-05-14 12:00:36.628991] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:09.784 [2024-05-14 12:00:36.629018] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:09.784 pt2 00:24:09.784 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:24:09.784 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:24:09.784 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:24:09.784 [2024-05-14 12:00:36.868127] 
bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:09.784 [2024-05-14 12:00:36.869548] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:10.041 [2024-05-14 12:00:36.869710] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xc565e0 00:24:10.041 [2024-05-14 12:00:36.869723] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:10.041 [2024-05-14 12:00:36.869797] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb466a0 00:24:10.041 [2024-05-14 12:00:36.869916] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc565e0 00:24:10.041 [2024-05-14 12:00:36.869926] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc565e0 00:24:10.041 [2024-05-14 12:00:36.870002] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:10.041 12:00:36 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.041 12:00:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.299 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:10.299 "name": "raid_bdev1", 00:24:10.299 "uuid": "58a904a9-a709-42b2-8563-6368a596d3e0", 00:24:10.299 "strip_size_kb": 0, 00:24:10.299 "state": "online", 00:24:10.299 "raid_level": "raid1", 00:24:10.299 "superblock": true, 00:24:10.299 "num_base_bdevs": 2, 00:24:10.299 "num_base_bdevs_discovered": 2, 00:24:10.299 "num_base_bdevs_operational": 2, 00:24:10.299 "base_bdevs_list": [ 00:24:10.299 { 00:24:10.299 "name": "pt1", 00:24:10.299 "uuid": "a8b1b994-b58a-5960-a742-add925e7783c", 00:24:10.299 "is_configured": true, 00:24:10.299 "data_offset": 256, 00:24:10.299 "data_size": 7936 00:24:10.299 }, 00:24:10.299 { 00:24:10.299 "name": "pt2", 00:24:10.299 "uuid": "6f6139a8-090e-50a8-aa56-5cb4f271a5df", 00:24:10.299 "is_configured": true, 00:24:10.299 "data_offset": 256, 00:24:10.299 "data_size": 7936 00:24:10.299 } 00:24:10.299 ] 00:24:10.299 }' 00:24:10.299 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:10.299 12:00:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:10.894 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:24:10.894 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_name=raid_bdev1 00:24:10.894 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:24:10.894 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:24:10.894 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:24:10.894 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@199 -- # local name 00:24:10.894 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:10.894 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:24:11.164 [2024-05-14 12:00:37.963375] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:11.164 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:24:11.164 "name": "raid_bdev1", 00:24:11.164 "aliases": [ 00:24:11.164 "58a904a9-a709-42b2-8563-6368a596d3e0" 00:24:11.164 ], 00:24:11.164 "product_name": "Raid Volume", 00:24:11.164 "block_size": 4096, 00:24:11.164 "num_blocks": 7936, 00:24:11.164 "uuid": "58a904a9-a709-42b2-8563-6368a596d3e0", 00:24:11.164 "md_size": 32, 00:24:11.164 "md_interleave": false, 00:24:11.164 "dif_type": 0, 00:24:11.164 "assigned_rate_limits": { 00:24:11.164 "rw_ios_per_sec": 0, 00:24:11.164 "rw_mbytes_per_sec": 0, 00:24:11.164 "r_mbytes_per_sec": 0, 00:24:11.164 "w_mbytes_per_sec": 0 00:24:11.164 }, 00:24:11.164 "claimed": false, 00:24:11.164 "zoned": false, 00:24:11.164 "supported_io_types": { 00:24:11.164 "read": true, 00:24:11.164 "write": true, 00:24:11.164 "unmap": false, 00:24:11.164 "write_zeroes": true, 00:24:11.164 "flush": false, 00:24:11.164 "reset": true, 00:24:11.164 "compare": false, 00:24:11.164 "compare_and_write": false, 00:24:11.164 "abort": false, 
00:24:11.164 "nvme_admin": false, 00:24:11.164 "nvme_io": false 00:24:11.164 }, 00:24:11.164 "memory_domains": [ 00:24:11.164 { 00:24:11.164 "dma_device_id": "system", 00:24:11.164 "dma_device_type": 1 00:24:11.164 }, 00:24:11.164 { 00:24:11.164 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:11.164 "dma_device_type": 2 00:24:11.164 }, 00:24:11.164 { 00:24:11.164 "dma_device_id": "system", 00:24:11.164 "dma_device_type": 1 00:24:11.164 }, 00:24:11.164 { 00:24:11.164 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:11.164 "dma_device_type": 2 00:24:11.164 } 00:24:11.164 ], 00:24:11.164 "driver_specific": { 00:24:11.164 "raid": { 00:24:11.164 "uuid": "58a904a9-a709-42b2-8563-6368a596d3e0", 00:24:11.164 "strip_size_kb": 0, 00:24:11.164 "state": "online", 00:24:11.164 "raid_level": "raid1", 00:24:11.164 "superblock": true, 00:24:11.164 "num_base_bdevs": 2, 00:24:11.164 "num_base_bdevs_discovered": 2, 00:24:11.164 "num_base_bdevs_operational": 2, 00:24:11.164 "base_bdevs_list": [ 00:24:11.164 { 00:24:11.164 "name": "pt1", 00:24:11.164 "uuid": "a8b1b994-b58a-5960-a742-add925e7783c", 00:24:11.164 "is_configured": true, 00:24:11.164 "data_offset": 256, 00:24:11.164 "data_size": 7936 00:24:11.164 }, 00:24:11.164 { 00:24:11.164 "name": "pt2", 00:24:11.164 "uuid": "6f6139a8-090e-50a8-aa56-5cb4f271a5df", 00:24:11.164 "is_configured": true, 00:24:11.164 "data_offset": 256, 00:24:11.164 "data_size": 7936 00:24:11.164 } 00:24:11.164 ] 00:24:11.164 } 00:24:11.164 } 00:24:11.164 }' 00:24:11.164 12:00:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:11.164 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:24:11.164 pt2' 00:24:11.164 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:11.164 12:00:38 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:11.164 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:11.425 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:11.426 "name": "pt1", 00:24:11.426 "aliases": [ 00:24:11.426 "a8b1b994-b58a-5960-a742-add925e7783c" 00:24:11.426 ], 00:24:11.426 "product_name": "passthru", 00:24:11.426 "block_size": 4096, 00:24:11.426 "num_blocks": 8192, 00:24:11.426 "uuid": "a8b1b994-b58a-5960-a742-add925e7783c", 00:24:11.426 "md_size": 32, 00:24:11.426 "md_interleave": false, 00:24:11.426 "dif_type": 0, 00:24:11.426 "assigned_rate_limits": { 00:24:11.426 "rw_ios_per_sec": 0, 00:24:11.426 "rw_mbytes_per_sec": 0, 00:24:11.426 "r_mbytes_per_sec": 0, 00:24:11.426 "w_mbytes_per_sec": 0 00:24:11.426 }, 00:24:11.426 "claimed": true, 00:24:11.426 "claim_type": "exclusive_write", 00:24:11.426 "zoned": false, 00:24:11.426 "supported_io_types": { 00:24:11.426 "read": true, 00:24:11.426 "write": true, 00:24:11.426 "unmap": true, 00:24:11.426 "write_zeroes": true, 00:24:11.426 "flush": true, 00:24:11.426 "reset": true, 00:24:11.426 "compare": false, 00:24:11.426 "compare_and_write": false, 00:24:11.426 "abort": true, 00:24:11.426 "nvme_admin": false, 00:24:11.426 "nvme_io": false 00:24:11.426 }, 00:24:11.426 "memory_domains": [ 00:24:11.426 { 00:24:11.426 "dma_device_id": "system", 00:24:11.426 "dma_device_type": 1 00:24:11.426 }, 00:24:11.426 { 00:24:11.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:11.426 "dma_device_type": 2 00:24:11.426 } 00:24:11.426 ], 00:24:11.426 "driver_specific": { 00:24:11.426 "passthru": { 00:24:11.426 "name": "pt1", 00:24:11.426 "base_bdev_name": "malloc1" 00:24:11.426 } 00:24:11.426 } 00:24:11.426 }' 00:24:11.426 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 
00:24:11.426 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:11.426 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:11.426 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:11.426 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:11.426 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:11.426 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:11.426 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:11.686 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:11.686 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:11.686 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:11.686 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:11.686 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:11.686 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:11.686 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:11.945 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:11.945 "name": "pt2", 00:24:11.945 "aliases": [ 00:24:11.945 "6f6139a8-090e-50a8-aa56-5cb4f271a5df" 00:24:11.945 ], 00:24:11.945 "product_name": "passthru", 00:24:11.945 "block_size": 4096, 00:24:11.945 "num_blocks": 8192, 
00:24:11.945 "uuid": "6f6139a8-090e-50a8-aa56-5cb4f271a5df", 00:24:11.945 "md_size": 32, 00:24:11.945 "md_interleave": false, 00:24:11.945 "dif_type": 0, 00:24:11.945 "assigned_rate_limits": { 00:24:11.945 "rw_ios_per_sec": 0, 00:24:11.945 "rw_mbytes_per_sec": 0, 00:24:11.945 "r_mbytes_per_sec": 0, 00:24:11.945 "w_mbytes_per_sec": 0 00:24:11.945 }, 00:24:11.945 "claimed": true, 00:24:11.945 "claim_type": "exclusive_write", 00:24:11.945 "zoned": false, 00:24:11.945 "supported_io_types": { 00:24:11.945 "read": true, 00:24:11.945 "write": true, 00:24:11.945 "unmap": true, 00:24:11.945 "write_zeroes": true, 00:24:11.945 "flush": true, 00:24:11.945 "reset": true, 00:24:11.945 "compare": false, 00:24:11.945 "compare_and_write": false, 00:24:11.945 "abort": true, 00:24:11.945 "nvme_admin": false, 00:24:11.945 "nvme_io": false 00:24:11.945 }, 00:24:11.945 "memory_domains": [ 00:24:11.945 { 00:24:11.945 "dma_device_id": "system", 00:24:11.945 "dma_device_type": 1 00:24:11.945 }, 00:24:11.945 { 00:24:11.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:11.945 "dma_device_type": 2 00:24:11.945 } 00:24:11.945 ], 00:24:11.945 "driver_specific": { 00:24:11.945 "passthru": { 00:24:11.945 "name": "pt2", 00:24:11.945 "base_bdev_name": "malloc2" 00:24:11.945 } 00:24:11.945 } 00:24:11.945 }' 00:24:11.945 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:11.945 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:11.945 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:11.945 12:00:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:11.945 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:12.204 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:12.204 12:00:39 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:12.204 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:12.204 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:12.204 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:12.204 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:12.204 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:12.204 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:12.204 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:24:12.462 [2024-05-14 12:00:39.467353] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:12.462 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=58a904a9-a709-42b2-8563-6368a596d3e0 00:24:12.462 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # '[' -z 58a904a9-a709-42b2-8563-6368a596d3e0 ']' 00:24:12.462 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:12.721 [2024-05-14 12:00:39.699733] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:12.721 [2024-05-14 12:00:39.699754] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:12.721 [2024-05-14 12:00:39.699814] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:12.721 [2024-05-14 
12:00:39.699869] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:12.721 [2024-05-14 12:00:39.699881] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc565e0 name raid_bdev1, state offline 00:24:12.721 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.721 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:24:12.980 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:24:12.980 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:24:12.980 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:24:12.980 12:00:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:13.241 12:00:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:24:13.241 12:00:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:13.501 12:00:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:24:13.501 12:00:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:24:13.760 12:00:40 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:13.760 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:14.019 [2024-05-14 12:00:40.955030] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:24:14.019 [2024-05-14 12:00:40.956437] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:24:14.019 [2024-05-14 12:00:40.956496] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:24:14.019 [2024-05-14 12:00:40.956539] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:24:14.019 [2024-05-14 12:00:40.956557] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:14.019 [2024-05-14 12:00:40.956567] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc55c20 name raid_bdev1, state configuring 00:24:14.019 request: 00:24:14.019 { 00:24:14.019 "name": "raid_bdev1", 00:24:14.019 "raid_level": "raid1", 00:24:14.019 "base_bdevs": [ 00:24:14.019 "malloc1", 00:24:14.019 "malloc2" 00:24:14.019 ], 00:24:14.019 "superblock": false, 00:24:14.019 "method": "bdev_raid_create", 00:24:14.019 "req_id": 1 00:24:14.019 } 00:24:14.019 Got JSON-RPC error response 00:24:14.019 response: 00:24:14.019 { 00:24:14.019 "code": -17, 00:24:14.019 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:24:14.019 } 00:24:14.019 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:24:14.019 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:14.019 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:14.019 12:00:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:14.019 12:00:40 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.019 12:00:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:24:14.278 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:24:14.278 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:24:14.278 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:14.535 [2024-05-14 12:00:41.440245] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:14.535 [2024-05-14 12:00:41.440288] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:14.535 [2024-05-14 12:00:41.440307] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb457e0 00:24:14.535 [2024-05-14 12:00:41.440320] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:14.535 [2024-05-14 12:00:41.441769] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:14.535 [2024-05-14 12:00:41.441796] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:14.535 [2024-05-14 12:00:41.441841] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:24:14.535 [2024-05-14 12:00:41.441866] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:14.535 pt1 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 
-- # local raid_bdev_name=raid_bdev1 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.535 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.794 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:14.794 "name": "raid_bdev1", 00:24:14.794 "uuid": "58a904a9-a709-42b2-8563-6368a596d3e0", 00:24:14.794 "strip_size_kb": 0, 00:24:14.794 "state": "configuring", 00:24:14.794 "raid_level": "raid1", 00:24:14.794 "superblock": true, 00:24:14.794 "num_base_bdevs": 2, 00:24:14.794 "num_base_bdevs_discovered": 1, 00:24:14.794 "num_base_bdevs_operational": 2, 00:24:14.794 "base_bdevs_list": [ 00:24:14.794 { 00:24:14.794 "name": "pt1", 00:24:14.794 "uuid": "a8b1b994-b58a-5960-a742-add925e7783c", 00:24:14.794 "is_configured": true, 00:24:14.794 
"data_offset": 256, 00:24:14.794 "data_size": 7936 00:24:14.794 }, 00:24:14.794 { 00:24:14.794 "name": null, 00:24:14.794 "uuid": "6f6139a8-090e-50a8-aa56-5cb4f271a5df", 00:24:14.794 "is_configured": false, 00:24:14.794 "data_offset": 256, 00:24:14.794 "data_size": 7936 00:24:14.794 } 00:24:14.794 ] 00:24:14.794 }' 00:24:14.794 12:00:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:14.794 12:00:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:15.362 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:24:15.362 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:24:15.362 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:24:15.362 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:15.621 [2024-05-14 12:00:42.511083] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:15.621 [2024-05-14 12:00:42.511136] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:15.621 [2024-05-14 12:00:42.511156] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb449e0 00:24:15.621 [2024-05-14 12:00:42.511168] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:15.621 [2024-05-14 12:00:42.511357] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:15.621 [2024-05-14 12:00:42.511373] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:15.621 [2024-05-14 12:00:42.511428] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 
00:24:15.621 [2024-05-14 12:00:42.511449] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:15.621 [2024-05-14 12:00:42.511542] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xc4be00 00:24:15.621 [2024-05-14 12:00:42.511553] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:15.621 [2024-05-14 12:00:42.511608] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc4d250 00:24:15.621 [2024-05-14 12:00:42.511714] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc4be00 00:24:15.621 [2024-05-14 12:00:42.511723] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc4be00 00:24:15.621 [2024-05-14 12:00:42.511793] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:15.621 pt2 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:15.621 
12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.621 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.880 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:15.880 "name": "raid_bdev1", 00:24:15.880 "uuid": "58a904a9-a709-42b2-8563-6368a596d3e0", 00:24:15.880 "strip_size_kb": 0, 00:24:15.880 "state": "online", 00:24:15.880 "raid_level": "raid1", 00:24:15.880 "superblock": true, 00:24:15.880 "num_base_bdevs": 2, 00:24:15.880 "num_base_bdevs_discovered": 2, 00:24:15.880 "num_base_bdevs_operational": 2, 00:24:15.880 "base_bdevs_list": [ 00:24:15.880 { 00:24:15.880 "name": "pt1", 00:24:15.880 "uuid": "a8b1b994-b58a-5960-a742-add925e7783c", 00:24:15.880 "is_configured": true, 00:24:15.880 "data_offset": 256, 00:24:15.880 "data_size": 7936 00:24:15.880 }, 00:24:15.880 { 00:24:15.880 "name": "pt2", 00:24:15.880 "uuid": "6f6139a8-090e-50a8-aa56-5cb4f271a5df", 00:24:15.880 "is_configured": true, 00:24:15.880 "data_offset": 256, 00:24:15.880 "data_size": 7936 00:24:15.880 } 00:24:15.880 ] 00:24:15.880 }' 00:24:15.880 12:00:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:15.880 12:00:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:16.446 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties 
raid_bdev1 00:24:16.446 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:24:16.446 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:24:16.446 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:24:16.446 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:24:16.446 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@199 -- # local name 00:24:16.446 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:24:16.446 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:16.705 [2024-05-14 12:00:43.586157] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:16.705 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:24:16.705 "name": "raid_bdev1", 00:24:16.705 "aliases": [ 00:24:16.705 "58a904a9-a709-42b2-8563-6368a596d3e0" 00:24:16.705 ], 00:24:16.705 "product_name": "Raid Volume", 00:24:16.705 "block_size": 4096, 00:24:16.705 "num_blocks": 7936, 00:24:16.705 "uuid": "58a904a9-a709-42b2-8563-6368a596d3e0", 00:24:16.705 "md_size": 32, 00:24:16.705 "md_interleave": false, 00:24:16.705 "dif_type": 0, 00:24:16.705 "assigned_rate_limits": { 00:24:16.705 "rw_ios_per_sec": 0, 00:24:16.705 "rw_mbytes_per_sec": 0, 00:24:16.705 "r_mbytes_per_sec": 0, 00:24:16.705 "w_mbytes_per_sec": 0 00:24:16.705 }, 00:24:16.705 "claimed": false, 00:24:16.705 "zoned": false, 00:24:16.705 "supported_io_types": { 00:24:16.705 "read": true, 00:24:16.705 "write": true, 00:24:16.705 "unmap": false, 00:24:16.705 "write_zeroes": true, 00:24:16.705 "flush": false, 00:24:16.705 
"reset": true, 00:24:16.705 "compare": false, 00:24:16.705 "compare_and_write": false, 00:24:16.705 "abort": false, 00:24:16.705 "nvme_admin": false, 00:24:16.705 "nvme_io": false 00:24:16.705 }, 00:24:16.705 "memory_domains": [ 00:24:16.705 { 00:24:16.705 "dma_device_id": "system", 00:24:16.705 "dma_device_type": 1 00:24:16.705 }, 00:24:16.705 { 00:24:16.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:16.705 "dma_device_type": 2 00:24:16.705 }, 00:24:16.705 { 00:24:16.705 "dma_device_id": "system", 00:24:16.705 "dma_device_type": 1 00:24:16.705 }, 00:24:16.705 { 00:24:16.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:16.705 "dma_device_type": 2 00:24:16.705 } 00:24:16.705 ], 00:24:16.705 "driver_specific": { 00:24:16.705 "raid": { 00:24:16.705 "uuid": "58a904a9-a709-42b2-8563-6368a596d3e0", 00:24:16.705 "strip_size_kb": 0, 00:24:16.705 "state": "online", 00:24:16.705 "raid_level": "raid1", 00:24:16.705 "superblock": true, 00:24:16.705 "num_base_bdevs": 2, 00:24:16.705 "num_base_bdevs_discovered": 2, 00:24:16.705 "num_base_bdevs_operational": 2, 00:24:16.705 "base_bdevs_list": [ 00:24:16.705 { 00:24:16.705 "name": "pt1", 00:24:16.705 "uuid": "a8b1b994-b58a-5960-a742-add925e7783c", 00:24:16.705 "is_configured": true, 00:24:16.705 "data_offset": 256, 00:24:16.705 "data_size": 7936 00:24:16.705 }, 00:24:16.705 { 00:24:16.705 "name": "pt2", 00:24:16.705 "uuid": "6f6139a8-090e-50a8-aa56-5cb4f271a5df", 00:24:16.705 "is_configured": true, 00:24:16.705 "data_offset": 256, 00:24:16.705 "data_size": 7936 00:24:16.705 } 00:24:16.705 ] 00:24:16.705 } 00:24:16.705 } 00:24:16.705 }' 00:24:16.705 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:16.705 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:24:16.705 pt2' 00:24:16.705 12:00:43 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:16.705 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:16.705 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:16.964 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:16.964 "name": "pt1", 00:24:16.964 "aliases": [ 00:24:16.964 "a8b1b994-b58a-5960-a742-add925e7783c" 00:24:16.964 ], 00:24:16.964 "product_name": "passthru", 00:24:16.964 "block_size": 4096, 00:24:16.964 "num_blocks": 8192, 00:24:16.964 "uuid": "a8b1b994-b58a-5960-a742-add925e7783c", 00:24:16.964 "md_size": 32, 00:24:16.964 "md_interleave": false, 00:24:16.964 "dif_type": 0, 00:24:16.964 "assigned_rate_limits": { 00:24:16.964 "rw_ios_per_sec": 0, 00:24:16.964 "rw_mbytes_per_sec": 0, 00:24:16.964 "r_mbytes_per_sec": 0, 00:24:16.964 "w_mbytes_per_sec": 0 00:24:16.964 }, 00:24:16.964 "claimed": true, 00:24:16.964 "claim_type": "exclusive_write", 00:24:16.964 "zoned": false, 00:24:16.964 "supported_io_types": { 00:24:16.964 "read": true, 00:24:16.964 "write": true, 00:24:16.964 "unmap": true, 00:24:16.964 "write_zeroes": true, 00:24:16.964 "flush": true, 00:24:16.964 "reset": true, 00:24:16.964 "compare": false, 00:24:16.964 "compare_and_write": false, 00:24:16.964 "abort": true, 00:24:16.964 "nvme_admin": false, 00:24:16.964 "nvme_io": false 00:24:16.964 }, 00:24:16.964 "memory_domains": [ 00:24:16.964 { 00:24:16.964 "dma_device_id": "system", 00:24:16.964 "dma_device_type": 1 00:24:16.964 }, 00:24:16.964 { 00:24:16.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:16.964 "dma_device_type": 2 00:24:16.964 } 00:24:16.964 ], 00:24:16.964 "driver_specific": { 00:24:16.964 "passthru": { 00:24:16.964 "name": "pt1", 00:24:16.964 "base_bdev_name": "malloc1" 00:24:16.964 } 00:24:16.964 } 
00:24:16.964 }' 00:24:16.964 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:16.964 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:16.964 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:16.964 12:00:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:16.964 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:17.222 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:24:17.481 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:24:17.481 "name": "pt2", 00:24:17.481 "aliases": [ 00:24:17.481 "6f6139a8-090e-50a8-aa56-5cb4f271a5df" 
00:24:17.481 ], 00:24:17.481 "product_name": "passthru", 00:24:17.481 "block_size": 4096, 00:24:17.481 "num_blocks": 8192, 00:24:17.481 "uuid": "6f6139a8-090e-50a8-aa56-5cb4f271a5df", 00:24:17.481 "md_size": 32, 00:24:17.481 "md_interleave": false, 00:24:17.481 "dif_type": 0, 00:24:17.481 "assigned_rate_limits": { 00:24:17.481 "rw_ios_per_sec": 0, 00:24:17.481 "rw_mbytes_per_sec": 0, 00:24:17.481 "r_mbytes_per_sec": 0, 00:24:17.481 "w_mbytes_per_sec": 0 00:24:17.481 }, 00:24:17.481 "claimed": true, 00:24:17.481 "claim_type": "exclusive_write", 00:24:17.481 "zoned": false, 00:24:17.481 "supported_io_types": { 00:24:17.481 "read": true, 00:24:17.481 "write": true, 00:24:17.481 "unmap": true, 00:24:17.481 "write_zeroes": true, 00:24:17.481 "flush": true, 00:24:17.481 "reset": true, 00:24:17.481 "compare": false, 00:24:17.481 "compare_and_write": false, 00:24:17.481 "abort": true, 00:24:17.481 "nvme_admin": false, 00:24:17.481 "nvme_io": false 00:24:17.481 }, 00:24:17.481 "memory_domains": [ 00:24:17.481 { 00:24:17.481 "dma_device_id": "system", 00:24:17.481 "dma_device_type": 1 00:24:17.481 }, 00:24:17.481 { 00:24:17.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:17.481 "dma_device_type": 2 00:24:17.481 } 00:24:17.481 ], 00:24:17.481 "driver_specific": { 00:24:17.481 "passthru": { 00:24:17.481 "name": "pt2", 00:24:17.481 "base_bdev_name": "malloc2" 00:24:17.481 } 00:24:17.481 } 00:24:17.481 }' 00:24:17.481 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:17.481 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:24:17.739 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:24:17.739 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:17.739 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:24:17.739 12:00:44 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:24:17.739 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:17.739 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:24:17.739 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:24:17.739 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:17.739 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:24:17.998 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:24:17.998 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:24:17.998 12:00:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:17.998 [2024-05-14 12:00:45.054039] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:17.998 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@487 -- # '[' 58a904a9-a709-42b2-8563-6368a596d3e0 '!=' 58a904a9-a709-42b2-8563-6368a596d3e0 ']' 00:24:17.998 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:24:17.998 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # case $1 in 00:24:17.998 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@215 -- # return 0 00:24:17.998 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:18.255 [2024-05-14 12:00:45.302496] 
bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.255 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.512 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:18.512 "name": "raid_bdev1", 00:24:18.512 "uuid": "58a904a9-a709-42b2-8563-6368a596d3e0", 00:24:18.512 "strip_size_kb": 0, 00:24:18.512 "state": "online", 00:24:18.512 "raid_level": "raid1", 00:24:18.512 "superblock": true, 00:24:18.512 
"num_base_bdevs": 2, 00:24:18.512 "num_base_bdevs_discovered": 1, 00:24:18.512 "num_base_bdevs_operational": 1, 00:24:18.512 "base_bdevs_list": [ 00:24:18.512 { 00:24:18.512 "name": null, 00:24:18.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:18.512 "is_configured": false, 00:24:18.512 "data_offset": 256, 00:24:18.512 "data_size": 7936 00:24:18.512 }, 00:24:18.512 { 00:24:18.512 "name": "pt2", 00:24:18.512 "uuid": "6f6139a8-090e-50a8-aa56-5cb4f271a5df", 00:24:18.512 "is_configured": true, 00:24:18.512 "data_offset": 256, 00:24:18.512 "data_size": 7936 00:24:18.512 } 00:24:18.512 ] 00:24:18.512 }' 00:24:18.512 12:00:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:18.512 12:00:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:19.447 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:19.447 [2024-05-14 12:00:46.385330] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:19.447 [2024-05-14 12:00:46.385355] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:19.447 [2024-05-14 12:00:46.385418] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:19.447 [2024-05-14 12:00:46.385469] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:19.447 [2024-05-14 12:00:46.385480] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc4be00 name raid_bdev1, state offline 00:24:19.447 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.447 12:00:46 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:24:19.705 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:24:19.705 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:24:19.705 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:24:19.705 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:24:19.705 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:19.964 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:24:19.964 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:24:19.964 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:24:19.964 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:24:19.964 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # i=1 00:24:19.964 12:00:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:19.964 [2024-05-14 12:00:47.031070] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:19.964 [2024-05-14 12:00:47.031114] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:19.964 [2024-05-14 12:00:47.031134] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc552b0 00:24:19.964 [2024-05-14 12:00:47.031147] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:19.964 
[2024-05-14 12:00:47.032648] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:19.964 [2024-05-14 12:00:47.032675] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:19.964 [2024-05-14 12:00:47.032722] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:24:19.964 [2024-05-14 12:00:47.032749] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:19.964 [2024-05-14 12:00:47.032828] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0xc4c560 00:24:19.964 [2024-05-14 12:00:47.032838] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:19.964 [2024-05-14 12:00:47.032897] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb450b0 00:24:19.964 [2024-05-14 12:00:47.032998] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc4c560 00:24:19.964 [2024-05-14 12:00:47.033007] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc4c560 00:24:19.964 [2024-05-14 12:00:47.033075] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:19.964 pt2 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=1 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:20.223 "name": "raid_bdev1", 00:24:20.223 "uuid": "58a904a9-a709-42b2-8563-6368a596d3e0", 00:24:20.223 "strip_size_kb": 0, 00:24:20.223 "state": "online", 00:24:20.223 "raid_level": "raid1", 00:24:20.223 "superblock": true, 00:24:20.223 "num_base_bdevs": 2, 00:24:20.223 "num_base_bdevs_discovered": 1, 00:24:20.223 "num_base_bdevs_operational": 1, 00:24:20.223 "base_bdevs_list": [ 00:24:20.223 { 00:24:20.223 "name": null, 00:24:20.223 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:20.223 "is_configured": false, 00:24:20.223 "data_offset": 256, 00:24:20.223 "data_size": 7936 00:24:20.223 }, 00:24:20.223 { 00:24:20.223 "name": "pt2", 00:24:20.223 "uuid": "6f6139a8-090e-50a8-aa56-5cb4f271a5df", 00:24:20.223 "is_configured": true, 00:24:20.223 "data_offset": 256, 00:24:20.223 "data_size": 7936 00:24:20.223 } 00:24:20.223 ] 00:24:20.223 }' 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:20.223 12:00:47 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@10 -- # set +x 00:24:20.790 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # '[' 2 -gt 2 ']' 00:24:20.790 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:20.790 12:00:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:24:21.063 [2024-05-14 12:00:48.013874] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@563 -- # '[' 58a904a9-a709-42b2-8563-6368a596d3e0 '!=' 58a904a9-a709-42b2-8563-6368a596d3e0 ']' 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@568 -- # killprocess 1792591 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@946 -- # '[' -z 1792591 ']' 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@950 -- # kill -0 1792591 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@951 -- # uname 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1792591 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1792591' 00:24:21.063 killing process with pid 1792591 00:24:21.063 12:00:48 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@965 -- # kill 1792591 00:24:21.063 [2024-05-14 12:00:48.080006] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:21.063 [2024-05-14 12:00:48.080066] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:21.063 [2024-05-14 12:00:48.080114] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:21.063 [2024-05-14 12:00:48.080126] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc4c560 name raid_bdev1, state offline 00:24:21.063 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@970 -- # wait 1792591 00:24:21.063 [2024-05-14 12:00:48.102408] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:21.326 12:00:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # return 0 00:24:21.326 00:24:21.326 real 0m13.611s 00:24:21.326 user 0m24.570s 00:24:21.326 sys 0m2.527s 00:24:21.326 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:21.326 12:00:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:21.326 ************************************ 00:24:21.326 END TEST raid_superblock_test_md_separate 00:24:21.326 ************************************ 00:24:21.326 12:00:48 bdev_raid -- bdev/bdev_raid.sh@853 -- # '[' true = true ']' 00:24:21.326 12:00:48 bdev_raid -- bdev/bdev_raid.sh@854 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:24:21.326 12:00:48 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:24:21.326 12:00:48 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:21.326 12:00:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:21.326 ************************************ 00:24:21.326 START TEST raid_rebuild_test_sb_md_separate 
00:24:21.326 ************************************ 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false true 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local verify=true 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@581 -- # local strip_size 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@582 -- # local create_arg 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@584 -- # local data_offset 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # raid_pid=1794623 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@603 -- # waitforlisten 1794623 /var/tmp/spdk-raid.sock 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@827 -- # '[' -z 1794623 ']' 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # echo 'Waiting for process 
to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:21.326 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:21.326 12:00:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:21.586 [2024-05-14 12:00:48.434983] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:24:21.586 [2024-05-14 12:00:48.435026] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1794623 ] 00:24:21.586 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:21.586 Zero copy mechanism will not be used. 00:24:21.586 [2024-05-14 12:00:48.545886] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:21.586 [2024-05-14 12:00:48.652067] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:21.846 [2024-05-14 12:00:48.718929] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:21.846 [2024-05-14 12:00:48.718974] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:22.412 12:00:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:22.412 12:00:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@860 -- # return 0 00:24:22.412 12:00:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:24:22.412 12:00:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:24:22.670 BaseBdev1_malloc 
00:24:22.670 12:00:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:22.928 [2024-05-14 12:00:49.858976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:22.928 [2024-05-14 12:00:49.859025] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:22.928 [2024-05-14 12:00:49.859047] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8863f0 00:24:22.928 [2024-05-14 12:00:49.859060] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:22.928 [2024-05-14 12:00:49.860556] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:22.928 [2024-05-14 12:00:49.860583] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:22.928 BaseBdev1 00:24:22.928 12:00:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:24:22.928 12:00:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:24:23.186 BaseBdev2_malloc 00:24:23.186 12:00:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:23.444 [2024-05-14 12:00:50.341734] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:23.444 [2024-05-14 12:00:50.341777] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:23.444 [2024-05-14 12:00:50.341798] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x9dd1c0 00:24:23.444 [2024-05-14 12:00:50.341810] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:23.444 [2024-05-14 12:00:50.343248] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:23.444 [2024-05-14 12:00:50.343275] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:23.444 BaseBdev2 00:24:23.444 12:00:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:24:23.703 spare_malloc 00:24:23.703 12:00:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:23.962 spare_delay 00:24:23.962 12:00:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:24.222 [2024-05-14 12:00:51.049005] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:24.222 [2024-05-14 12:00:51.049055] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:24.222 [2024-05-14 12:00:51.049077] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x98c5f0 00:24:24.222 [2024-05-14 12:00:51.049090] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:24.222 [2024-05-14 12:00:51.050574] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:24.222 [2024-05-14 12:00:51.050600] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:24.222 spare 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:24.222 [2024-05-14 12:00:51.281655] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:24.222 [2024-05-14 12:00:51.283007] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:24.222 [2024-05-14 12:00:51.283180] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x98e580 00:24:24.222 [2024-05-14 12:00:51.283193] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:24.222 [2024-05-14 12:00:51.283269] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8847d0 00:24:24.222 [2024-05-14 12:00:51.283386] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x98e580 00:24:24.222 [2024-05-14 12:00:51.283395] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x98e580 00:24:24.222 [2024-05-14 12:00:51.283474] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:24.222 12:00:51 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.222 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.482 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:24.482 "name": "raid_bdev1", 00:24:24.482 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:24.482 "strip_size_kb": 0, 00:24:24.482 "state": "online", 00:24:24.482 "raid_level": "raid1", 00:24:24.482 "superblock": true, 00:24:24.482 "num_base_bdevs": 2, 00:24:24.482 "num_base_bdevs_discovered": 2, 00:24:24.482 "num_base_bdevs_operational": 2, 00:24:24.482 "base_bdevs_list": [ 00:24:24.482 { 00:24:24.482 "name": "BaseBdev1", 00:24:24.482 "uuid": "9141472e-8f00-5f13-9cea-9137b510d2e1", 00:24:24.482 "is_configured": true, 00:24:24.482 "data_offset": 256, 00:24:24.482 "data_size": 7936 00:24:24.482 }, 00:24:24.482 { 00:24:24.482 "name": "BaseBdev2", 00:24:24.482 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:24.482 "is_configured": true, 00:24:24.482 "data_offset": 256, 00:24:24.482 "data_size": 7936 00:24:24.482 } 00:24:24.482 ] 00:24:24.482 }' 00:24:24.482 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:24.482 12:00:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:24:25.049 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:25.049 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:24:25.308 [2024-05-14 12:00:52.344699] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:25.308 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=7936 00:24:25.308 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.308 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # data_offset=256 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@629 -- # '[' true = true ']' 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@630 -- # local write_unit_size 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@633 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0') 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:25.567 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:25.825 [2024-05-14 12:00:52.841839] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x98e400 00:24:25.825 /dev/nbd0 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@865 -- # local i 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # break 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:24:25.825 
12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:25.825 1+0 records in 00:24:25.825 1+0 records out 00:24:25.825 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259103 s, 15.8 MB/s 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # size=4096 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # return 0 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # '[' raid1 = raid5f ']' 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@638 -- # write_unit_size=1 00:24:25.825 12:00:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@640 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:24:26.762 7936+0 records in 00:24:26.762 7936+0 records out 00:24:26.762 32505856 bytes (33 MB, 31 MiB) copied, 0.743867 s, 43.7 MB/s 00:24:26.762 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@641 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:26.762 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:26.762 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:26.762 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:26.762 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:24:26.762 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:26.762 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:27.021 [2024-05-14 12:00:53.912028] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:27.021 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:27.021 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:27.021 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:27.021 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:27.021 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:27.021 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:27.021 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:27.021 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:27.021 12:00:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:27.301 [2024-05-14 12:00:54.152715] 
bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.301 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.560 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:27.560 "name": "raid_bdev1", 00:24:27.560 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:27.560 "strip_size_kb": 0, 00:24:27.560 "state": "online", 00:24:27.560 "raid_level": "raid1", 00:24:27.560 "superblock": true, 00:24:27.560 
"num_base_bdevs": 2, 00:24:27.560 "num_base_bdevs_discovered": 1, 00:24:27.560 "num_base_bdevs_operational": 1, 00:24:27.560 "base_bdevs_list": [ 00:24:27.560 { 00:24:27.560 "name": null, 00:24:27.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.560 "is_configured": false, 00:24:27.560 "data_offset": 256, 00:24:27.560 "data_size": 7936 00:24:27.560 }, 00:24:27.560 { 00:24:27.560 "name": "BaseBdev2", 00:24:27.560 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:27.560 "is_configured": true, 00:24:27.560 "data_offset": 256, 00:24:27.560 "data_size": 7936 00:24:27.560 } 00:24:27.560 ] 00:24:27.560 }' 00:24:27.560 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:27.560 12:00:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:28.128 12:00:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:28.387 [2024-05-14 12:00:55.243840] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:28.387 [2024-05-14 12:00:55.246143] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x884730 00:24:28.387 [2024-05-14 12:00:55.248454] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:28.387 12:00:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # sleep 1 00:24:29.322 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:29.322 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:29.322 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:29.322 12:00:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:29.323 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:29.323 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.323 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.582 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:29.582 "name": "raid_bdev1", 00:24:29.582 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:29.582 "strip_size_kb": 0, 00:24:29.582 "state": "online", 00:24:29.582 "raid_level": "raid1", 00:24:29.582 "superblock": true, 00:24:29.582 "num_base_bdevs": 2, 00:24:29.582 "num_base_bdevs_discovered": 2, 00:24:29.582 "num_base_bdevs_operational": 2, 00:24:29.582 "process": { 00:24:29.582 "type": "rebuild", 00:24:29.582 "target": "spare", 00:24:29.582 "progress": { 00:24:29.582 "blocks": 2816, 00:24:29.582 "percent": 35 00:24:29.582 } 00:24:29.582 }, 00:24:29.582 "base_bdevs_list": [ 00:24:29.582 { 00:24:29.582 "name": "spare", 00:24:29.582 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:29.582 "is_configured": true, 00:24:29.582 "data_offset": 256, 00:24:29.582 "data_size": 7936 00:24:29.582 }, 00:24:29.582 { 00:24:29.582 "name": "BaseBdev2", 00:24:29.582 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:29.582 "is_configured": true, 00:24:29.582 "data_offset": 256, 00:24:29.582 "data_size": 7936 00:24:29.582 } 00:24:29.582 ] 00:24:29.582 }' 00:24:29.582 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:29.582 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
00:24:29.582 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:29.582 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:29.582 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:29.842 [2024-05-14 12:00:56.750461] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:29.842 [2024-05-14 12:00:56.760262] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:29.842 [2024-05-14 12:00:56.760307] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:29.842 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:29.842 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:29.843 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:29.843 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:29.843 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:29.843 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:29.843 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:29.843 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:29.843 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:29.843 12:00:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:29.843 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.843 12:00:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.103 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:30.103 "name": "raid_bdev1", 00:24:30.103 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:30.103 "strip_size_kb": 0, 00:24:30.103 "state": "online", 00:24:30.103 "raid_level": "raid1", 00:24:30.103 "superblock": true, 00:24:30.103 "num_base_bdevs": 2, 00:24:30.103 "num_base_bdevs_discovered": 1, 00:24:30.103 "num_base_bdevs_operational": 1, 00:24:30.103 "base_bdevs_list": [ 00:24:30.103 { 00:24:30.103 "name": null, 00:24:30.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:30.103 "is_configured": false, 00:24:30.103 "data_offset": 256, 00:24:30.103 "data_size": 7936 00:24:30.103 }, 00:24:30.103 { 00:24:30.103 "name": "BaseBdev2", 00:24:30.103 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:30.103 "is_configured": true, 00:24:30.103 "data_offset": 256, 00:24:30.103 "data_size": 7936 00:24:30.103 } 00:24:30.103 ] 00:24:30.103 }' 00:24:30.103 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:30.103 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:30.674 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:30.674 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:30.674 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local 
process_type=none 00:24:30.674 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:30.674 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:30.674 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.674 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.933 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:30.933 "name": "raid_bdev1", 00:24:30.933 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:30.933 "strip_size_kb": 0, 00:24:30.933 "state": "online", 00:24:30.933 "raid_level": "raid1", 00:24:30.933 "superblock": true, 00:24:30.933 "num_base_bdevs": 2, 00:24:30.933 "num_base_bdevs_discovered": 1, 00:24:30.933 "num_base_bdevs_operational": 1, 00:24:30.933 "base_bdevs_list": [ 00:24:30.933 { 00:24:30.933 "name": null, 00:24:30.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:30.933 "is_configured": false, 00:24:30.933 "data_offset": 256, 00:24:30.933 "data_size": 7936 00:24:30.933 }, 00:24:30.933 { 00:24:30.933 "name": "BaseBdev2", 00:24:30.933 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:30.933 "is_configured": true, 00:24:30.933 "data_offset": 256, 00:24:30.933 "data_size": 7936 00:24:30.933 } 00:24:30.933 ] 00:24:30.933 }' 00:24:30.933 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:30.933 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:30.933 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:30.933 12:00:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:30.933 12:00:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:31.192 [2024-05-14 12:00:58.215279] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:31.192 [2024-05-14 12:00:58.217567] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x98e490 00:24:31.192 [2024-05-14 12:00:58.219058] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:31.192 12:00:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@668 -- # sleep 1 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:32.569 "name": "raid_bdev1", 00:24:32.569 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:32.569 
"strip_size_kb": 0, 00:24:32.569 "state": "online", 00:24:32.569 "raid_level": "raid1", 00:24:32.569 "superblock": true, 00:24:32.569 "num_base_bdevs": 2, 00:24:32.569 "num_base_bdevs_discovered": 2, 00:24:32.569 "num_base_bdevs_operational": 2, 00:24:32.569 "process": { 00:24:32.569 "type": "rebuild", 00:24:32.569 "target": "spare", 00:24:32.569 "progress": { 00:24:32.569 "blocks": 3072, 00:24:32.569 "percent": 38 00:24:32.569 } 00:24:32.569 }, 00:24:32.569 "base_bdevs_list": [ 00:24:32.569 { 00:24:32.569 "name": "spare", 00:24:32.569 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:32.569 "is_configured": true, 00:24:32.569 "data_offset": 256, 00:24:32.569 "data_size": 7936 00:24:32.569 }, 00:24:32.569 { 00:24:32.569 "name": "BaseBdev2", 00:24:32.569 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:32.569 "is_configured": true, 00:24:32.569 "data_offset": 256, 00:24:32.569 "data_size": 7936 00:24:32.569 } 00:24:32.569 ] 00:24:32.569 }' 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:24:32.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:24:32.569 12:00:59 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@711 -- # local timeout=915 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:32.569 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:32.570 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.570 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.951 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:32.951 "name": "raid_bdev1", 00:24:32.951 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:32.951 "strip_size_kb": 0, 00:24:32.951 "state": "online", 00:24:32.951 "raid_level": "raid1", 00:24:32.951 "superblock": true, 00:24:32.951 "num_base_bdevs": 2, 00:24:32.951 "num_base_bdevs_discovered": 2, 00:24:32.951 "num_base_bdevs_operational": 2, 00:24:32.951 "process": { 00:24:32.951 "type": "rebuild", 00:24:32.951 "target": "spare", 00:24:32.951 "progress": { 
00:24:32.951 "blocks": 3840, 00:24:32.951 "percent": 48 00:24:32.951 } 00:24:32.951 }, 00:24:32.951 "base_bdevs_list": [ 00:24:32.952 { 00:24:32.952 "name": "spare", 00:24:32.952 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:32.952 "is_configured": true, 00:24:32.952 "data_offset": 256, 00:24:32.952 "data_size": 7936 00:24:32.952 }, 00:24:32.952 { 00:24:32.952 "name": "BaseBdev2", 00:24:32.952 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:32.952 "is_configured": true, 00:24:32.952 "data_offset": 256, 00:24:32.952 "data_size": 7936 00:24:32.952 } 00:24:32.952 ] 00:24:32.952 }' 00:24:32.952 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:32.952 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:32.952 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:32.952 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:32.952 12:00:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@716 -- # sleep 1 00:24:33.886 12:01:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:24:33.886 12:01:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:33.886 12:01:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:33.886 12:01:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:33.886 12:01:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:33.886 12:01:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:33.886 12:01:00 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.886 12:01:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.145 12:01:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:34.145 "name": "raid_bdev1", 00:24:34.145 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:34.145 "strip_size_kb": 0, 00:24:34.145 "state": "online", 00:24:34.145 "raid_level": "raid1", 00:24:34.145 "superblock": true, 00:24:34.145 "num_base_bdevs": 2, 00:24:34.145 "num_base_bdevs_discovered": 2, 00:24:34.145 "num_base_bdevs_operational": 2, 00:24:34.145 "process": { 00:24:34.145 "type": "rebuild", 00:24:34.145 "target": "spare", 00:24:34.145 "progress": { 00:24:34.145 "blocks": 7168, 00:24:34.145 "percent": 90 00:24:34.145 } 00:24:34.145 }, 00:24:34.145 "base_bdevs_list": [ 00:24:34.145 { 00:24:34.145 "name": "spare", 00:24:34.145 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:34.145 "is_configured": true, 00:24:34.145 "data_offset": 256, 00:24:34.145 "data_size": 7936 00:24:34.145 }, 00:24:34.145 { 00:24:34.145 "name": "BaseBdev2", 00:24:34.145 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:34.145 "is_configured": true, 00:24:34.145 "data_offset": 256, 00:24:34.145 "data_size": 7936 00:24:34.145 } 00:24:34.145 ] 00:24:34.145 }' 00:24:34.145 12:01:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:34.145 12:01:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:34.145 12:01:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:34.145 12:01:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == 
\s\p\a\r\e ]] 00:24:34.145 12:01:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@716 -- # sleep 1 00:24:34.403 [2024-05-14 12:01:01.343594] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:34.403 [2024-05-14 12:01:01.343653] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:34.403 [2024-05-14 12:01:01.343733] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:35.339 "name": "raid_bdev1", 00:24:35.339 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:35.339 "strip_size_kb": 0, 00:24:35.339 "state": "online", 00:24:35.339 "raid_level": "raid1", 00:24:35.339 "superblock": true, 00:24:35.339 "num_base_bdevs": 2, 00:24:35.339 
"num_base_bdevs_discovered": 2, 00:24:35.339 "num_base_bdevs_operational": 2, 00:24:35.339 "base_bdevs_list": [ 00:24:35.339 { 00:24:35.339 "name": "spare", 00:24:35.339 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:35.339 "is_configured": true, 00:24:35.339 "data_offset": 256, 00:24:35.339 "data_size": 7936 00:24:35.339 }, 00:24:35.339 { 00:24:35.339 "name": "BaseBdev2", 00:24:35.339 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:35.339 "is_configured": true, 00:24:35.339 "data_offset": 256, 00:24:35.339 "data_size": 7936 00:24:35.339 } 00:24:35.339 ] 00:24:35.339 }' 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:35.339 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:35.597 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:24:35.597 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # break 00:24:35.597 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:35.597 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:35.597 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:35.597 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:35.597 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:35.597 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.597 12:01:02 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:35.856 "name": "raid_bdev1", 00:24:35.856 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:35.856 "strip_size_kb": 0, 00:24:35.856 "state": "online", 00:24:35.856 "raid_level": "raid1", 00:24:35.856 "superblock": true, 00:24:35.856 "num_base_bdevs": 2, 00:24:35.856 "num_base_bdevs_discovered": 2, 00:24:35.856 "num_base_bdevs_operational": 2, 00:24:35.856 "base_bdevs_list": [ 00:24:35.856 { 00:24:35.856 "name": "spare", 00:24:35.856 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:35.856 "is_configured": true, 00:24:35.856 "data_offset": 256, 00:24:35.856 "data_size": 7936 00:24:35.856 }, 00:24:35.856 { 00:24:35.856 "name": "BaseBdev2", 00:24:35.856 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:35.856 "is_configured": true, 00:24:35.856 "data_offset": 256, 00:24:35.856 "data_size": 7936 00:24:35.856 } 00:24:35.856 ] 00:24:35.856 }' 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.856 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.115 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:36.115 "name": "raid_bdev1", 00:24:36.115 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:36.115 "strip_size_kb": 0, 00:24:36.115 "state": "online", 00:24:36.115 "raid_level": "raid1", 00:24:36.115 "superblock": true, 00:24:36.115 "num_base_bdevs": 2, 00:24:36.115 "num_base_bdevs_discovered": 2, 00:24:36.115 "num_base_bdevs_operational": 2, 00:24:36.115 "base_bdevs_list": [ 00:24:36.115 { 00:24:36.115 "name": "spare", 00:24:36.115 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:36.115 "is_configured": true, 00:24:36.115 "data_offset": 256, 00:24:36.115 "data_size": 7936 00:24:36.115 }, 00:24:36.115 { 00:24:36.115 "name": 
"BaseBdev2", 00:24:36.115 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:36.115 "is_configured": true, 00:24:36.115 "data_offset": 256, 00:24:36.115 "data_size": 7936 00:24:36.115 } 00:24:36.115 ] 00:24:36.115 }' 00:24:36.115 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:36.115 12:01:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:36.683 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:36.683 [2024-05-14 12:01:03.721619] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:36.683 [2024-05-14 12:01:03.721648] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:36.683 [2024-05-14 12:01:03.721707] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:36.683 [2024-05-14 12:01:03.721765] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:36.683 [2024-05-14 12:01:03.721777] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x98e580 name raid_bdev1, state offline 00:24:36.683 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.683 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@725 -- # jq length 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@727 -- # '[' true = true ']' 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@728 -- # '[' false = true ']' 00:24:36.942 12:01:03 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:36.942 12:01:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:37.201 /dev/nbd0 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@865 -- # local i 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 
-- # (( i <= 20 )) 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # break 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:24:37.201 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:37.202 1+0 records in 00:24:37.202 1+0 records out 00:24:37.202 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231628 s, 17.7 MB/s 00:24:37.202 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:37.202 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # size=4096 00:24:37.202 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:37.202 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:24:37.202 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # return 0 00:24:37.202 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:37.202 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:37.202 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:37.460 /dev/nbd1 
00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@865 -- # local i 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # break 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:37.460 1+0 records in 00:24:37.460 1+0 records out 00:24:37.460 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000327235 s, 12.5 MB/s 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # size=4096 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:37.460 
12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # return 0 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:37.460 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@743 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:37.719 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:37.720 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:37.720 
12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:37.720 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:37.720 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:37.720 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:37.720 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:37.720 12:01:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@751 -- # '[' -z 
BaseBdev1 ']' 00:24:37.982 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:38.241 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:38.500 [2024-05-14 12:01:05.432331] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:38.500 [2024-05-14 12:01:05.432379] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.500 [2024-05-14 12:01:05.432406] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x884bd0 00:24:38.500 [2024-05-14 12:01:05.432419] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.500 [2024-05-14 12:01:05.433869] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.500 [2024-05-14 12:01:05.433898] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:38.500 [2024-05-14 12:01:05.433948] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:38.500 [2024-05-14 12:01:05.433974] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:38.500 BaseBdev1 00:24:38.500 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:24:38.501 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev2 ']' 00:24:38.501 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:24:38.759 12:01:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:39.019 [2024-05-14 12:01:05.857466] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:39.019 [2024-05-14 12:01:05.857506] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.019 [2024-05-14 12:01:05.857523] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x990590 00:24:39.019 [2024-05-14 12:01:05.857536] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.019 [2024-05-14 12:01:05.857698] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.019 [2024-05-14 12:01:05.857714] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:39.019 [2024-05-14 12:01:05.857754] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev2 00:24:39.019 [2024-05-14 12:01:05.857764] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing raid bdev raid_bdev1 (1) 00:24:39.019 [2024-05-14 12:01:05.857775] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:39.019 [2024-05-14 12:01:05.857791] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x991260 name raid_bdev1, state configuring 00:24:39.019 [2024-05-14 12:01:05.857820] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:39.019 BaseBdev2 00:24:39.019 12:01:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:39.019 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@758 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:39.278 [2024-05-14 12:01:06.210391] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:39.278 [2024-05-14 12:01:06.210428] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.278 [2024-05-14 12:01:06.210445] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x98c390 00:24:39.278 [2024-05-14 12:01:06.210458] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.278 [2024-05-14 12:01:06.210624] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.278 [2024-05-14 12:01:06.210639] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:39.278 [2024-05-14 12:01:06.210689] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:24:39.278 [2024-05-14 12:01:06.210706] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:39.278 spare 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local 
raid_bdev_info 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.278 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.278 [2024-05-14 12:01:06.311029] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x98e9d0 00:24:39.278 [2024-05-14 12:01:06.311045] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:39.278 [2024-05-14 12:01:06.311104] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x990f20 00:24:39.278 [2024-05-14 12:01:06.311220] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x98e9d0 00:24:39.278 [2024-05-14 12:01:06.311230] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x98e9d0 00:24:39.278 [2024-05-14 12:01:06.311302] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:39.538 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:39.538 "name": "raid_bdev1", 00:24:39.538 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:39.538 "strip_size_kb": 0, 00:24:39.538 "state": "online", 00:24:39.538 "raid_level": "raid1", 00:24:39.538 "superblock": true, 00:24:39.538 "num_base_bdevs": 2, 00:24:39.538 "num_base_bdevs_discovered": 2, 00:24:39.538 "num_base_bdevs_operational": 2, 00:24:39.538 "base_bdevs_list": [ 00:24:39.538 { 00:24:39.538 
"name": "spare", 00:24:39.538 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:39.538 "is_configured": true, 00:24:39.538 "data_offset": 256, 00:24:39.538 "data_size": 7936 00:24:39.538 }, 00:24:39.538 { 00:24:39.538 "name": "BaseBdev2", 00:24:39.538 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:39.538 "is_configured": true, 00:24:39.538 "data_offset": 256, 00:24:39.538 "data_size": 7936 00:24:39.538 } 00:24:39.538 ] 00:24:39.538 }' 00:24:39.538 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:39.538 12:01:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:40.107 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:40.107 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:40.107 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:40.107 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:40.107 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:40.107 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.107 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:40.365 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:40.365 "name": "raid_bdev1", 00:24:40.365 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:40.365 "strip_size_kb": 0, 00:24:40.365 "state": "online", 00:24:40.365 "raid_level": "raid1", 00:24:40.365 "superblock": true, 00:24:40.365 
"num_base_bdevs": 2, 00:24:40.365 "num_base_bdevs_discovered": 2, 00:24:40.365 "num_base_bdevs_operational": 2, 00:24:40.365 "base_bdevs_list": [ 00:24:40.365 { 00:24:40.365 "name": "spare", 00:24:40.365 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:40.365 "is_configured": true, 00:24:40.365 "data_offset": 256, 00:24:40.365 "data_size": 7936 00:24:40.365 }, 00:24:40.365 { 00:24:40.366 "name": "BaseBdev2", 00:24:40.366 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:40.366 "is_configured": true, 00:24:40.366 "data_offset": 256, 00:24:40.366 "data_size": 7936 00:24:40.366 } 00:24:40.366 ] 00:24:40.366 }' 00:24:40.366 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:40.366 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:40.366 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:40.366 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:40.366 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.366 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:40.625 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:24:40.625 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:40.884 [2024-05-14 12:01:07.754603] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.884 12:01:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.145 12:01:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:41.145 "name": "raid_bdev1", 00:24:41.145 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:41.145 "strip_size_kb": 0, 00:24:41.145 "state": "online", 00:24:41.145 "raid_level": "raid1", 00:24:41.145 "superblock": true, 00:24:41.145 "num_base_bdevs": 2, 00:24:41.145 "num_base_bdevs_discovered": 1, 00:24:41.145 "num_base_bdevs_operational": 1, 00:24:41.145 "base_bdevs_list": [ 00:24:41.145 { 
00:24:41.145 "name": null, 00:24:41.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.145 "is_configured": false, 00:24:41.145 "data_offset": 256, 00:24:41.145 "data_size": 7936 00:24:41.145 }, 00:24:41.145 { 00:24:41.145 "name": "BaseBdev2", 00:24:41.145 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:41.145 "is_configured": true, 00:24:41.145 "data_offset": 256, 00:24:41.145 "data_size": 7936 00:24:41.145 } 00:24:41.145 ] 00:24:41.145 }' 00:24:41.145 12:01:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:41.145 12:01:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:41.714 12:01:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:41.974 [2024-05-14 12:01:08.833521] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:41.974 [2024-05-14 12:01:08.833667] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:41.974 [2024-05-14 12:01:08.833683] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:41.974 [2024-05-14 12:01:08.833709] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:41.974 [2024-05-14 12:01:08.835866] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x98f030 00:24:41.974 [2024-05-14 12:01:08.837250] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:41.974 12:01:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # sleep 1 00:24:42.909 12:01:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:42.909 12:01:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:42.909 12:01:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:42.909 12:01:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:42.909 12:01:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:42.909 12:01:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.909 12:01:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.168 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:43.168 "name": "raid_bdev1", 00:24:43.168 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:43.168 "strip_size_kb": 0, 00:24:43.168 "state": "online", 00:24:43.168 "raid_level": "raid1", 00:24:43.168 "superblock": true, 00:24:43.168 "num_base_bdevs": 2, 00:24:43.168 "num_base_bdevs_discovered": 2, 00:24:43.168 "num_base_bdevs_operational": 2, 00:24:43.168 "process": { 00:24:43.168 "type": "rebuild", 00:24:43.168 
"target": "spare", 00:24:43.168 "progress": { 00:24:43.168 "blocks": 3072, 00:24:43.168 "percent": 38 00:24:43.168 } 00:24:43.168 }, 00:24:43.168 "base_bdevs_list": [ 00:24:43.168 { 00:24:43.168 "name": "spare", 00:24:43.168 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:43.168 "is_configured": true, 00:24:43.168 "data_offset": 256, 00:24:43.168 "data_size": 7936 00:24:43.168 }, 00:24:43.168 { 00:24:43.168 "name": "BaseBdev2", 00:24:43.168 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:43.168 "is_configured": true, 00:24:43.168 "data_offset": 256, 00:24:43.168 "data_size": 7936 00:24:43.168 } 00:24:43.168 ] 00:24:43.168 }' 00:24:43.168 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:43.168 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:43.168 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:43.168 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:43.168 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:43.428 [2024-05-14 12:01:10.430783] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:43.428 [2024-05-14 12:01:10.449661] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:43.428 [2024-05-14 12:01:10.449705] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=raid_bdev1 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:43.428 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.429 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.688 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:43.688 "name": "raid_bdev1", 00:24:43.688 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:43.688 "strip_size_kb": 0, 00:24:43.688 "state": "online", 00:24:43.688 "raid_level": "raid1", 00:24:43.688 "superblock": true, 00:24:43.688 "num_base_bdevs": 2, 00:24:43.688 "num_base_bdevs_discovered": 1, 00:24:43.688 "num_base_bdevs_operational": 1, 00:24:43.688 "base_bdevs_list": [ 00:24:43.688 { 00:24:43.688 "name": null, 00:24:43.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.688 "is_configured": false, 00:24:43.688 "data_offset": 256, 
00:24:43.688 "data_size": 7936 00:24:43.688 }, 00:24:43.688 { 00:24:43.688 "name": "BaseBdev2", 00:24:43.688 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:43.688 "is_configured": true, 00:24:43.688 "data_offset": 256, 00:24:43.688 "data_size": 7936 00:24:43.688 } 00:24:43.688 ] 00:24:43.688 }' 00:24:43.688 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:43.688 12:01:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:44.256 12:01:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:44.515 [2024-05-14 12:01:11.536269] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:44.515 [2024-05-14 12:01:11.536317] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:44.515 [2024-05-14 12:01:11.536339] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x98ee10 00:24:44.515 [2024-05-14 12:01:11.536351] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:44.515 [2024-05-14 12:01:11.536575] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:44.516 [2024-05-14 12:01:11.536593] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:44.516 [2024-05-14 12:01:11.536652] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:24:44.516 [2024-05-14 12:01:11.536664] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:44.516 [2024-05-14 12:01:11.536675] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:44.516 [2024-05-14 12:01:11.536693] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:44.516 [2024-05-14 12:01:11.538868] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x991210 00:24:44.516 [2024-05-14 12:01:11.540220] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:44.516 spare 00:24:44.516 12:01:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # sleep 1 00:24:45.501 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:45.501 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:45.501 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:45.501 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:45.501 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:45.501 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.501 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.760 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:45.760 "name": "raid_bdev1", 00:24:45.760 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:45.760 "strip_size_kb": 0, 00:24:45.760 "state": "online", 00:24:45.760 "raid_level": "raid1", 00:24:45.760 "superblock": true, 00:24:45.760 "num_base_bdevs": 2, 00:24:45.760 "num_base_bdevs_discovered": 2, 00:24:45.760 "num_base_bdevs_operational": 2, 00:24:45.760 "process": { 00:24:45.760 "type": "rebuild", 
00:24:45.760 "target": "spare", 00:24:45.760 "progress": { 00:24:45.760 "blocks": 3072, 00:24:45.760 "percent": 38 00:24:45.760 } 00:24:45.760 }, 00:24:45.760 "base_bdevs_list": [ 00:24:45.760 { 00:24:45.760 "name": "spare", 00:24:45.760 "uuid": "a5e33ce1-62ee-5e9c-b087-a4124028f3e7", 00:24:45.760 "is_configured": true, 00:24:45.760 "data_offset": 256, 00:24:45.760 "data_size": 7936 00:24:45.760 }, 00:24:45.760 { 00:24:45.760 "name": "BaseBdev2", 00:24:45.760 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:45.760 "is_configured": true, 00:24:45.760 "data_offset": 256, 00:24:45.760 "data_size": 7936 00:24:45.760 } 00:24:45.760 ] 00:24:45.760 }' 00:24:45.760 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:46.020 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:46.020 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:46.020 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:46.020 12:01:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:46.279 [2024-05-14 12:01:13.121247] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:46.279 [2024-05-14 12:01:13.153184] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:46.279 [2024-05-14 12:01:13.153227] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=raid_bdev1 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.279 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.540 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:46.540 "name": "raid_bdev1", 00:24:46.540 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:46.540 "strip_size_kb": 0, 00:24:46.540 "state": "online", 00:24:46.540 "raid_level": "raid1", 00:24:46.540 "superblock": true, 00:24:46.540 "num_base_bdevs": 2, 00:24:46.540 "num_base_bdevs_discovered": 1, 00:24:46.540 "num_base_bdevs_operational": 1, 00:24:46.540 "base_bdevs_list": [ 00:24:46.540 { 00:24:46.540 "name": null, 00:24:46.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.540 "is_configured": false, 00:24:46.540 "data_offset": 256, 
00:24:46.540 "data_size": 7936 00:24:46.540 }, 00:24:46.540 { 00:24:46.540 "name": "BaseBdev2", 00:24:46.540 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:46.540 "is_configured": true, 00:24:46.540 "data_offset": 256, 00:24:46.540 "data_size": 7936 00:24:46.540 } 00:24:46.540 ] 00:24:46.540 }' 00:24:46.540 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:46.540 12:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:47.109 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:47.109 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:47.109 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:47.109 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:47.109 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:47.109 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.109 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.367 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:47.367 "name": "raid_bdev1", 00:24:47.367 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:47.367 "strip_size_kb": 0, 00:24:47.367 "state": "online", 00:24:47.367 "raid_level": "raid1", 00:24:47.367 "superblock": true, 00:24:47.367 "num_base_bdevs": 2, 00:24:47.367 "num_base_bdevs_discovered": 1, 00:24:47.367 "num_base_bdevs_operational": 1, 00:24:47.367 "base_bdevs_list": [ 
00:24:47.367 { 00:24:47.367 "name": null, 00:24:47.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.367 "is_configured": false, 00:24:47.367 "data_offset": 256, 00:24:47.367 "data_size": 7936 00:24:47.367 }, 00:24:47.367 { 00:24:47.367 "name": "BaseBdev2", 00:24:47.367 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:47.367 "is_configured": true, 00:24:47.367 "data_offset": 256, 00:24:47.367 "data_size": 7936 00:24:47.367 } 00:24:47.367 ] 00:24:47.367 }' 00:24:47.367 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:47.367 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:47.367 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:47.367 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:47.367 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:47.626 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:47.884 [2024-05-14 12:01:14.824827] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:47.884 [2024-05-14 12:01:14.824868] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:47.884 [2024-05-14 12:01:14.824886] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8845b0 00:24:47.884 [2024-05-14 12:01:14.824898] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:47.884 [2024-05-14 12:01:14.825087] vbdev_passthru.c: 704:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:24:47.884 [2024-05-14 12:01:14.825102] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:47.884 [2024-05-14 12:01:14.825147] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:47.884 [2024-05-14 12:01:14.825158] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:47.884 [2024-05-14 12:01:14.825168] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:47.884 BaseBdev1 00:24:47.884 12:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@786 -- # sleep 1 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:48.819 12:01:15 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.819 12:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.078 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:49.078 "name": "raid_bdev1", 00:24:49.078 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:49.078 "strip_size_kb": 0, 00:24:49.078 "state": "online", 00:24:49.078 "raid_level": "raid1", 00:24:49.078 "superblock": true, 00:24:49.078 "num_base_bdevs": 2, 00:24:49.078 "num_base_bdevs_discovered": 1, 00:24:49.078 "num_base_bdevs_operational": 1, 00:24:49.078 "base_bdevs_list": [ 00:24:49.078 { 00:24:49.078 "name": null, 00:24:49.078 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.078 "is_configured": false, 00:24:49.078 "data_offset": 256, 00:24:49.078 "data_size": 7936 00:24:49.078 }, 00:24:49.078 { 00:24:49.078 "name": "BaseBdev2", 00:24:49.078 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:49.078 "is_configured": true, 00:24:49.078 "data_offset": 256, 00:24:49.078 "data_size": 7936 00:24:49.078 } 00:24:49.078 ] 00:24:49.078 }' 00:24:49.078 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:49.078 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:49.643 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:49.643 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:49.643 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:49.643 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 
-- # local target=none 00:24:49.643 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:49.643 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.643 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:49.901 "name": "raid_bdev1", 00:24:49.901 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:49.901 "strip_size_kb": 0, 00:24:49.901 "state": "online", 00:24:49.901 "raid_level": "raid1", 00:24:49.901 "superblock": true, 00:24:49.901 "num_base_bdevs": 2, 00:24:49.901 "num_base_bdevs_discovered": 1, 00:24:49.901 "num_base_bdevs_operational": 1, 00:24:49.901 "base_bdevs_list": [ 00:24:49.901 { 00:24:49.901 "name": null, 00:24:49.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.901 "is_configured": false, 00:24:49.901 "data_offset": 256, 00:24:49.901 "data_size": 7936 00:24:49.901 }, 00:24:49.901 { 00:24:49.901 "name": "BaseBdev2", 00:24:49.901 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:49.901 "is_configured": true, 00:24:49.901 "data_offset": 256, 00:24:49.901 "data_size": 7936 00:24:49.901 } 00:24:49.901 ] 00:24:49.901 }' 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:49.901 12:01:16 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:49.901 12:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:50.159 [2024-05-14 12:01:17.187127] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:50.160 [2024-05-14 12:01:17.187249] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:50.160 [2024-05-14 12:01:17.187264] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:50.160 request: 00:24:50.160 { 00:24:50.160 "raid_bdev": "raid_bdev1", 00:24:50.160 "base_bdev": "BaseBdev1", 00:24:50.160 "method": "bdev_raid_add_base_bdev", 00:24:50.160 "req_id": 1 00:24:50.160 } 00:24:50.160 Got JSON-RPC error response 00:24:50.160 response: 00:24:50.160 { 00:24:50.160 "code": -22, 00:24:50.160 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:50.160 } 00:24:50.160 12:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:24:50.160 12:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:50.160 12:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:50.160 12:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:50.160 12:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@790 -- # sleep 1 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:51.533 12:01:18 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:51.533 "name": "raid_bdev1", 00:24:51.533 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:51.533 "strip_size_kb": 0, 00:24:51.533 "state": "online", 00:24:51.533 "raid_level": "raid1", 00:24:51.533 "superblock": true, 00:24:51.533 "num_base_bdevs": 2, 00:24:51.533 "num_base_bdevs_discovered": 1, 00:24:51.533 "num_base_bdevs_operational": 1, 00:24:51.533 "base_bdevs_list": [ 00:24:51.533 { 00:24:51.533 "name": null, 00:24:51.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.533 "is_configured": false, 00:24:51.533 "data_offset": 256, 00:24:51.533 "data_size": 7936 00:24:51.533 }, 00:24:51.533 { 00:24:51.533 "name": "BaseBdev2", 00:24:51.533 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:51.533 
"is_configured": true, 00:24:51.533 "data_offset": 256, 00:24:51.533 "data_size": 7936 00:24:51.533 } 00:24:51.533 ] 00:24:51.533 }' 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:51.533 12:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:52.100 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:52.100 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:52.100 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:52.100 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:52.100 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:52.101 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.101 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:52.359 "name": "raid_bdev1", 00:24:52.359 "uuid": "d971583e-f426-435b-8ad4-28353a2415c5", 00:24:52.359 "strip_size_kb": 0, 00:24:52.359 "state": "online", 00:24:52.359 "raid_level": "raid1", 00:24:52.359 "superblock": true, 00:24:52.359 "num_base_bdevs": 2, 00:24:52.359 "num_base_bdevs_discovered": 1, 00:24:52.359 "num_base_bdevs_operational": 1, 00:24:52.359 "base_bdevs_list": [ 00:24:52.359 { 00:24:52.359 "name": null, 00:24:52.359 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.359 "is_configured": false, 00:24:52.359 "data_offset": 256, 
00:24:52.359 "data_size": 7936 00:24:52.359 }, 00:24:52.359 { 00:24:52.359 "name": "BaseBdev2", 00:24:52.359 "uuid": "54cc6f5c-5456-51fa-891a-9b67fc80a288", 00:24:52.359 "is_configured": true, 00:24:52.359 "data_offset": 256, 00:24:52.359 "data_size": 7936 00:24:52.359 } 00:24:52.359 ] 00:24:52.359 }' 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@795 -- # killprocess 1794623 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@946 -- # '[' -z 1794623 ']' 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@950 -- # kill -0 1794623 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@951 -- # uname 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1794623 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1794623' 00:24:52.359 killing process with pid 1794623 00:24:52.359 12:01:19 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@965 -- # kill 1794623 00:24:52.359 Received shutdown signal, test time was about 60.000000 seconds 00:24:52.359 00:24:52.359 Latency(us) 00:24:52.359 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:52.359 =================================================================================================================== 00:24:52.359 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:52.359 [2024-05-14 12:01:19.394565] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:52.359 [2024-05-14 12:01:19.394661] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:52.359 [2024-05-14 12:01:19.394705] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:52.359 [2024-05-14 12:01:19.394718] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x98e9d0 name raid_bdev1, state offline 00:24:52.359 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@970 -- # wait 1794623 00:24:52.360 [2024-05-14 12:01:19.428164] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:52.618 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@797 -- # return 0 00:24:52.618 00:24:52.618 real 0m31.241s 00:24:52.618 user 0m48.817s 00:24:52.618 sys 0m4.957s 00:24:52.618 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:52.618 12:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:52.618 ************************************ 00:24:52.618 END TEST raid_rebuild_test_sb_md_separate 00:24:52.618 ************************************ 00:24:52.618 12:01:19 bdev_raid -- bdev/bdev_raid.sh@857 -- # base_malloc_params='-m 32 -i' 00:24:52.618 12:01:19 bdev_raid -- bdev/bdev_raid.sh@858 -- # run_test 
raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:24:52.618 12:01:19 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:24:52.618 12:01:19 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:52.618 12:01:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:52.877 ************************************ 00:24:52.877 START TEST raid_state_function_test_sb_md_interleaved 00:24:52.877 ************************************ 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i++ 
)) 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # raid_pid=1799123 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 1799123' 00:24:52.877 Process raid pid: 1799123 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:52.877 12:01:19 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@247 -- # waitforlisten 1799123 /var/tmp/spdk-raid.sock 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@827 -- # '[' -z 1799123 ']' 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:52.877 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:52.877 12:01:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:52.877 [2024-05-14 12:01:19.798488] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:24:52.877 [2024-05-14 12:01:19.798552] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:52.877 [2024-05-14 12:01:19.926030] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:53.136 [2024-05-14 12:01:20.030680] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:53.136 [2024-05-14 12:01:20.092616] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:53.136 [2024-05-14 12:01:20.092651] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:53.701 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:53.701 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # return 0 00:24:53.701 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:53.960 [2024-05-14 12:01:20.945261] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:53.960 [2024-05-14 12:01:20.945305] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:53.960 [2024-05-14 12:01:20.945317] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:53.960 [2024-05-14 12:01:20.945329] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- 
# local raid_bdev_name=Existed_Raid 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.960 12:01:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:54.219 12:01:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:54.219 "name": "Existed_Raid", 00:24:54.219 "uuid": "379f1cd5-e550-4e33-be10-46b1a3cde8c4", 00:24:54.219 "strip_size_kb": 0, 00:24:54.219 "state": "configuring", 00:24:54.219 "raid_level": "raid1", 00:24:54.219 "superblock": true, 00:24:54.219 "num_base_bdevs": 2, 00:24:54.219 "num_base_bdevs_discovered": 0, 00:24:54.219 "num_base_bdevs_operational": 2, 00:24:54.219 "base_bdevs_list": [ 00:24:54.219 { 00:24:54.219 "name": 
"BaseBdev1", 00:24:54.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.219 "is_configured": false, 00:24:54.219 "data_offset": 0, 00:24:54.219 "data_size": 0 00:24:54.219 }, 00:24:54.219 { 00:24:54.219 "name": "BaseBdev2", 00:24:54.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.219 "is_configured": false, 00:24:54.219 "data_offset": 0, 00:24:54.219 "data_size": 0 00:24:54.219 } 00:24:54.219 ] 00:24:54.219 }' 00:24:54.219 12:01:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:54.219 12:01:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:54.786 12:01:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:55.043 [2024-05-14 12:01:21.987858] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:55.043 [2024-05-14 12:01:21.987889] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1376700 name Existed_Raid, state configuring 00:24:55.043 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:55.301 [2024-05-14 12:01:22.168360] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:55.301 [2024-05-14 12:01:22.168389] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:55.301 [2024-05-14 12:01:22.168407] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:55.301 [2024-05-14 12:01:22.168420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:55.301 12:01:22 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:24:55.559 [2024-05-14 12:01:22.423029] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:55.559 BaseBdev1 00:24:55.559 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:24:55.559 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:24:55.559 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:24:55.559 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local i 00:24:55.559 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:24:55.559 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:24:55.559 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:55.818 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:56.077 [ 00:24:56.077 { 00:24:56.077 "name": "BaseBdev1", 00:24:56.077 "aliases": [ 00:24:56.077 "caf00f86-cd7c-49bb-a840-079837a154cb" 00:24:56.077 ], 00:24:56.077 "product_name": "Malloc disk", 00:24:56.077 "block_size": 4128, 00:24:56.077 "num_blocks": 8192, 00:24:56.077 "uuid": "caf00f86-cd7c-49bb-a840-079837a154cb", 00:24:56.077 "md_size": 32, 00:24:56.077 "md_interleave": 
true, 00:24:56.077 "dif_type": 0, 00:24:56.077 "assigned_rate_limits": { 00:24:56.077 "rw_ios_per_sec": 0, 00:24:56.077 "rw_mbytes_per_sec": 0, 00:24:56.077 "r_mbytes_per_sec": 0, 00:24:56.077 "w_mbytes_per_sec": 0 00:24:56.077 }, 00:24:56.077 "claimed": true, 00:24:56.077 "claim_type": "exclusive_write", 00:24:56.077 "zoned": false, 00:24:56.077 "supported_io_types": { 00:24:56.077 "read": true, 00:24:56.077 "write": true, 00:24:56.077 "unmap": true, 00:24:56.077 "write_zeroes": true, 00:24:56.077 "flush": true, 00:24:56.077 "reset": true, 00:24:56.077 "compare": false, 00:24:56.077 "compare_and_write": false, 00:24:56.077 "abort": true, 00:24:56.077 "nvme_admin": false, 00:24:56.077 "nvme_io": false 00:24:56.077 }, 00:24:56.077 "memory_domains": [ 00:24:56.077 { 00:24:56.077 "dma_device_id": "system", 00:24:56.077 "dma_device_type": 1 00:24:56.077 }, 00:24:56.077 { 00:24:56.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:56.077 "dma_device_type": 2 00:24:56.077 } 00:24:56.077 ], 00:24:56.077 "driver_specific": {} 00:24:56.077 } 00:24:56.077 ] 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # return 0 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=2 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.077 12:01:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:56.337 12:01:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:56.337 "name": "Existed_Raid", 00:24:56.337 "uuid": "89df2156-8f85-42fc-91f1-d24ab6661491", 00:24:56.337 "strip_size_kb": 0, 00:24:56.337 "state": "configuring", 00:24:56.337 "raid_level": "raid1", 00:24:56.337 "superblock": true, 00:24:56.337 "num_base_bdevs": 2, 00:24:56.337 "num_base_bdevs_discovered": 1, 00:24:56.337 "num_base_bdevs_operational": 2, 00:24:56.337 "base_bdevs_list": [ 00:24:56.337 { 00:24:56.337 "name": "BaseBdev1", 00:24:56.337 "uuid": "caf00f86-cd7c-49bb-a840-079837a154cb", 00:24:56.337 "is_configured": true, 00:24:56.337 "data_offset": 256, 00:24:56.337 "data_size": 7936 00:24:56.337 }, 00:24:56.337 { 00:24:56.337 "name": "BaseBdev2", 00:24:56.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.337 "is_configured": false, 00:24:56.337 "data_offset": 0, 00:24:56.337 "data_size": 0 00:24:56.337 } 00:24:56.337 ] 00:24:56.337 }' 00:24:56.337 12:01:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 
-- # xtrace_disable 00:24:56.337 12:01:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:56.904 12:01:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:56.904 [2024-05-14 12:01:23.987193] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:56.904 [2024-05-14 12:01:23.987233] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13769a0 name Existed_Raid, state configuring 00:24:57.163 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:57.163 [2024-05-14 12:01:24.231877] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:57.163 [2024-05-14 12:01:24.233350] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:57.163 [2024-05-14 12:01:24.233381] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:24:57.421 12:01:24 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.421 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:57.679 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:57.679 "name": "Existed_Raid", 00:24:57.679 "uuid": "cfec8491-9ea5-4f58-b8b5-3841d612f2f8", 00:24:57.679 "strip_size_kb": 0, 00:24:57.679 "state": "configuring", 00:24:57.679 "raid_level": "raid1", 00:24:57.679 "superblock": true, 00:24:57.679 "num_base_bdevs": 2, 00:24:57.679 "num_base_bdevs_discovered": 1, 00:24:57.679 "num_base_bdevs_operational": 2, 00:24:57.679 "base_bdevs_list": [ 00:24:57.679 { 00:24:57.679 "name": "BaseBdev1", 00:24:57.679 "uuid": "caf00f86-cd7c-49bb-a840-079837a154cb", 00:24:57.679 "is_configured": true, 00:24:57.679 "data_offset": 256, 00:24:57.679 "data_size": 7936 00:24:57.679 }, 
00:24:57.679 { 00:24:57.679 "name": "BaseBdev2", 00:24:57.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:57.679 "is_configured": false, 00:24:57.679 "data_offset": 0, 00:24:57.679 "data_size": 0 00:24:57.679 } 00:24:57.679 ] 00:24:57.679 }' 00:24:57.679 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:57.679 12:01:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:58.242 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:24:58.242 [2024-05-14 12:01:25.314241] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:58.242 [2024-05-14 12:01:25.314373] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1375ff0 00:24:58.242 [2024-05-14 12:01:25.314387] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:24:58.242 [2024-05-14 12:01:25.314453] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13758b0 00:24:58.242 [2024-05-14 12:01:25.314534] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1375ff0 00:24:58.242 [2024-05-14 12:01:25.314544] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1375ff0 00:24:58.242 [2024-05-14 12:01:25.314605] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:58.242 BaseBdev2 00:24:58.518 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:24:58.518 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:24:58.518 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@896 -- # local bdev_timeout= 00:24:58.518 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local i 00:24:58.518 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:24:58.518 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:24:58.518 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:58.518 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:58.776 [ 00:24:58.777 { 00:24:58.777 "name": "BaseBdev2", 00:24:58.777 "aliases": [ 00:24:58.777 "1b682368-a5a5-4de8-aa19-58d001501f29" 00:24:58.777 ], 00:24:58.777 "product_name": "Malloc disk", 00:24:58.777 "block_size": 4128, 00:24:58.777 "num_blocks": 8192, 00:24:58.777 "uuid": "1b682368-a5a5-4de8-aa19-58d001501f29", 00:24:58.777 "md_size": 32, 00:24:58.777 "md_interleave": true, 00:24:58.777 "dif_type": 0, 00:24:58.777 "assigned_rate_limits": { 00:24:58.777 "rw_ios_per_sec": 0, 00:24:58.777 "rw_mbytes_per_sec": 0, 00:24:58.777 "r_mbytes_per_sec": 0, 00:24:58.777 "w_mbytes_per_sec": 0 00:24:58.777 }, 00:24:58.777 "claimed": true, 00:24:58.777 "claim_type": "exclusive_write", 00:24:58.777 "zoned": false, 00:24:58.777 "supported_io_types": { 00:24:58.777 "read": true, 00:24:58.777 "write": true, 00:24:58.777 "unmap": true, 00:24:58.777 "write_zeroes": true, 00:24:58.777 "flush": true, 00:24:58.777 "reset": true, 00:24:58.777 "compare": false, 00:24:58.777 "compare_and_write": false, 00:24:58.777 "abort": true, 00:24:58.777 "nvme_admin": false, 00:24:58.777 "nvme_io": false 
00:24:58.777 }, 00:24:58.777 "memory_domains": [ 00:24:58.777 { 00:24:58.777 "dma_device_id": "system", 00:24:58.777 "dma_device_type": 1 00:24:58.777 }, 00:24:58.777 { 00:24:58.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:58.777 "dma_device_type": 2 00:24:58.777 } 00:24:58.777 ], 00:24:58.777 "driver_specific": {} 00:24:58.777 } 00:24:58.777 ] 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # return 0 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:58.777 12:01:25 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.777 12:01:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:59.036 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:59.036 "name": "Existed_Raid", 00:24:59.036 "uuid": "cfec8491-9ea5-4f58-b8b5-3841d612f2f8", 00:24:59.036 "strip_size_kb": 0, 00:24:59.036 "state": "online", 00:24:59.036 "raid_level": "raid1", 00:24:59.036 "superblock": true, 00:24:59.036 "num_base_bdevs": 2, 00:24:59.036 "num_base_bdevs_discovered": 2, 00:24:59.036 "num_base_bdevs_operational": 2, 00:24:59.036 "base_bdevs_list": [ 00:24:59.036 { 00:24:59.036 "name": "BaseBdev1", 00:24:59.036 "uuid": "caf00f86-cd7c-49bb-a840-079837a154cb", 00:24:59.036 "is_configured": true, 00:24:59.036 "data_offset": 256, 00:24:59.036 "data_size": 7936 00:24:59.036 }, 00:24:59.036 { 00:24:59.036 "name": "BaseBdev2", 00:24:59.036 "uuid": "1b682368-a5a5-4de8-aa19-58d001501f29", 00:24:59.036 "is_configured": true, 00:24:59.036 "data_offset": 256, 00:24:59.036 "data_size": 7936 00:24:59.036 } 00:24:59.036 ] 00:24:59.036 }' 00:24:59.036 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:59.036 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:59.602 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:24:59.602 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:24:59.602 
12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:24:59.602 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:24:59.602 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:24:59.602 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@199 -- # local name 00:24:59.602 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:59.602 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:24:59.860 [2024-05-14 12:01:26.886687] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:59.860 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:24:59.860 "name": "Existed_Raid", 00:24:59.860 "aliases": [ 00:24:59.860 "cfec8491-9ea5-4f58-b8b5-3841d612f2f8" 00:24:59.860 ], 00:24:59.860 "product_name": "Raid Volume", 00:24:59.860 "block_size": 4128, 00:24:59.860 "num_blocks": 7936, 00:24:59.860 "uuid": "cfec8491-9ea5-4f58-b8b5-3841d612f2f8", 00:24:59.860 "md_size": 32, 00:24:59.860 "md_interleave": true, 00:24:59.860 "dif_type": 0, 00:24:59.860 "assigned_rate_limits": { 00:24:59.860 "rw_ios_per_sec": 0, 00:24:59.860 "rw_mbytes_per_sec": 0, 00:24:59.860 "r_mbytes_per_sec": 0, 00:24:59.860 "w_mbytes_per_sec": 0 00:24:59.860 }, 00:24:59.860 "claimed": false, 00:24:59.860 "zoned": false, 00:24:59.860 "supported_io_types": { 00:24:59.860 "read": true, 00:24:59.860 "write": true, 00:24:59.860 "unmap": false, 00:24:59.860 "write_zeroes": true, 00:24:59.860 "flush": false, 00:24:59.860 "reset": true, 00:24:59.860 "compare": false, 00:24:59.860 "compare_and_write": 
false, 00:24:59.860 "abort": false, 00:24:59.860 "nvme_admin": false, 00:24:59.860 "nvme_io": false 00:24:59.860 }, 00:24:59.860 "memory_domains": [ 00:24:59.860 { 00:24:59.860 "dma_device_id": "system", 00:24:59.860 "dma_device_type": 1 00:24:59.860 }, 00:24:59.860 { 00:24:59.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:59.860 "dma_device_type": 2 00:24:59.860 }, 00:24:59.860 { 00:24:59.860 "dma_device_id": "system", 00:24:59.860 "dma_device_type": 1 00:24:59.860 }, 00:24:59.860 { 00:24:59.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:59.860 "dma_device_type": 2 00:24:59.860 } 00:24:59.860 ], 00:24:59.860 "driver_specific": { 00:24:59.860 "raid": { 00:24:59.860 "uuid": "cfec8491-9ea5-4f58-b8b5-3841d612f2f8", 00:24:59.860 "strip_size_kb": 0, 00:24:59.860 "state": "online", 00:24:59.860 "raid_level": "raid1", 00:24:59.860 "superblock": true, 00:24:59.860 "num_base_bdevs": 2, 00:24:59.860 "num_base_bdevs_discovered": 2, 00:24:59.860 "num_base_bdevs_operational": 2, 00:24:59.860 "base_bdevs_list": [ 00:24:59.860 { 00:24:59.860 "name": "BaseBdev1", 00:24:59.860 "uuid": "caf00f86-cd7c-49bb-a840-079837a154cb", 00:24:59.860 "is_configured": true, 00:24:59.860 "data_offset": 256, 00:24:59.860 "data_size": 7936 00:24:59.860 }, 00:24:59.860 { 00:24:59.860 "name": "BaseBdev2", 00:24:59.860 "uuid": "1b682368-a5a5-4de8-aa19-58d001501f29", 00:24:59.860 "is_configured": true, 00:24:59.860 "data_offset": 256, 00:24:59.860 "data_size": 7936 00:24:59.860 } 00:24:59.860 ] 00:24:59.860 } 00:24:59.860 } 00:24:59.860 }' 00:24:59.860 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:00.118 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:25:00.118 BaseBdev2' 00:25:00.118 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name 
in $base_bdev_names 00:25:00.118 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:00.118 12:01:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:00.118 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:00.118 "name": "BaseBdev1", 00:25:00.118 "aliases": [ 00:25:00.118 "caf00f86-cd7c-49bb-a840-079837a154cb" 00:25:00.118 ], 00:25:00.118 "product_name": "Malloc disk", 00:25:00.118 "block_size": 4128, 00:25:00.118 "num_blocks": 8192, 00:25:00.118 "uuid": "caf00f86-cd7c-49bb-a840-079837a154cb", 00:25:00.118 "md_size": 32, 00:25:00.118 "md_interleave": true, 00:25:00.118 "dif_type": 0, 00:25:00.118 "assigned_rate_limits": { 00:25:00.118 "rw_ios_per_sec": 0, 00:25:00.118 "rw_mbytes_per_sec": 0, 00:25:00.118 "r_mbytes_per_sec": 0, 00:25:00.118 "w_mbytes_per_sec": 0 00:25:00.118 }, 00:25:00.118 "claimed": true, 00:25:00.118 "claim_type": "exclusive_write", 00:25:00.118 "zoned": false, 00:25:00.118 "supported_io_types": { 00:25:00.118 "read": true, 00:25:00.118 "write": true, 00:25:00.118 "unmap": true, 00:25:00.118 "write_zeroes": true, 00:25:00.118 "flush": true, 00:25:00.118 "reset": true, 00:25:00.118 "compare": false, 00:25:00.118 "compare_and_write": false, 00:25:00.118 "abort": true, 00:25:00.118 "nvme_admin": false, 00:25:00.118 "nvme_io": false 00:25:00.118 }, 00:25:00.118 "memory_domains": [ 00:25:00.118 { 00:25:00.118 "dma_device_id": "system", 00:25:00.118 "dma_device_type": 1 00:25:00.118 }, 00:25:00.118 { 00:25:00.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:00.118 "dma_device_type": 2 00:25:00.118 } 00:25:00.118 ], 00:25:00.118 "driver_specific": {} 00:25:00.118 }' 00:25:00.376 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq 
.block_size 00:25:00.376 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:00.376 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:00.376 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:00.376 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:00.376 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:00.376 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:00.376 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:00.637 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:00.637 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:00.637 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:00.637 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:00.637 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:00.637 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:00.637 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:00.895 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:00.895 "name": "BaseBdev2", 00:25:00.895 "aliases": [ 
00:25:00.895 "1b682368-a5a5-4de8-aa19-58d001501f29" 00:25:00.895 ], 00:25:00.895 "product_name": "Malloc disk", 00:25:00.895 "block_size": 4128, 00:25:00.895 "num_blocks": 8192, 00:25:00.895 "uuid": "1b682368-a5a5-4de8-aa19-58d001501f29", 00:25:00.895 "md_size": 32, 00:25:00.895 "md_interleave": true, 00:25:00.895 "dif_type": 0, 00:25:00.895 "assigned_rate_limits": { 00:25:00.895 "rw_ios_per_sec": 0, 00:25:00.895 "rw_mbytes_per_sec": 0, 00:25:00.895 "r_mbytes_per_sec": 0, 00:25:00.895 "w_mbytes_per_sec": 0 00:25:00.895 }, 00:25:00.895 "claimed": true, 00:25:00.895 "claim_type": "exclusive_write", 00:25:00.895 "zoned": false, 00:25:00.895 "supported_io_types": { 00:25:00.895 "read": true, 00:25:00.895 "write": true, 00:25:00.895 "unmap": true, 00:25:00.895 "write_zeroes": true, 00:25:00.895 "flush": true, 00:25:00.895 "reset": true, 00:25:00.895 "compare": false, 00:25:00.895 "compare_and_write": false, 00:25:00.895 "abort": true, 00:25:00.895 "nvme_admin": false, 00:25:00.895 "nvme_io": false 00:25:00.895 }, 00:25:00.895 "memory_domains": [ 00:25:00.895 { 00:25:00.895 "dma_device_id": "system", 00:25:00.895 "dma_device_type": 1 00:25:00.895 }, 00:25:00.895 { 00:25:00.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:00.895 "dma_device_type": 2 00:25:00.895 } 00:25:00.895 ], 00:25:00.895 "driver_specific": {} 00:25:00.895 }' 00:25:00.895 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:00.895 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:00.895 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:00.895 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:00.895 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:00.895 12:01:27 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:00.895 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:00.895 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:01.153 12:01:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:01.153 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:01.153 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:01.153 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:01.154 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:01.412 [2024-05-14 12:01:28.302230] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # local expected_state 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # case $1 in 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@215 -- # return 0 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:01.412 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.670 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:01.670 "name": "Existed_Raid", 00:25:01.670 "uuid": "cfec8491-9ea5-4f58-b8b5-3841d612f2f8", 00:25:01.670 "strip_size_kb": 0, 00:25:01.670 "state": "online", 00:25:01.670 "raid_level": "raid1", 00:25:01.670 "superblock": true, 00:25:01.670 "num_base_bdevs": 2, 00:25:01.670 "num_base_bdevs_discovered": 1, 00:25:01.670 "num_base_bdevs_operational": 1, 00:25:01.670 "base_bdevs_list": [ 00:25:01.670 { 
00:25:01.670 "name": null, 00:25:01.670 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:01.670 "is_configured": false, 00:25:01.670 "data_offset": 256, 00:25:01.670 "data_size": 7936 00:25:01.670 }, 00:25:01.670 { 00:25:01.670 "name": "BaseBdev2", 00:25:01.670 "uuid": "1b682368-a5a5-4de8-aa19-58d001501f29", 00:25:01.670 "is_configured": true, 00:25:01.670 "data_offset": 256, 00:25:01.670 "data_size": 7936 00:25:01.670 } 00:25:01.670 ] 00:25:01.670 }' 00:25:01.670 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:01.670 12:01:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:02.238 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:25:02.238 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:25:02.238 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.238 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:25:02.495 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:25:02.495 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:02.496 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:02.496 [2024-05-14 12:01:29.550625] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:02.496 [2024-05-14 12:01:29.550696] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev 
state changing from online to offline 00:25:02.496 [2024-05-14 12:01:29.561979] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:02.496 [2024-05-14 12:01:29.562047] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:02.496 [2024-05-14 12:01:29.562061] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1375ff0 name Existed_Raid, state offline 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@342 -- # killprocess 1799123 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@946 -- # '[' -z 1799123 ']' 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # kill -0 1799123 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # uname 00:25:02.754 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:03.012 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1799123 00:25:03.012 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:25:03.012 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:25:03.012 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1799123' 00:25:03.012 killing process with pid 1799123 00:25:03.012 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@965 -- # kill 1799123 00:25:03.012 [2024-05-14 12:01:29.882212] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:03.012 12:01:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@970 -- # wait 1799123 00:25:03.012 [2024-05-14 12:01:29.883091] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:03.270 12:01:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@344 -- # return 0 00:25:03.270 00:25:03.270 real 0m10.391s 00:25:03.270 user 0m18.399s 00:25:03.270 sys 0m1.972s 00:25:03.270 12:01:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1122 -- # xtrace_disable 00:25:03.270 12:01:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:03.270 ************************************ 00:25:03.270 END TEST raid_state_function_test_sb_md_interleaved 00:25:03.270 ************************************ 00:25:03.270 12:01:30 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:25:03.270 12:01:30 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:25:03.270 
12:01:30 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:25:03.270 12:01:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:03.270 ************************************ 00:25:03.270 START TEST raid_superblock_test_md_interleaved 00:25:03.270 ************************************ 00:25:03.270 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:25:03.270 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:25:03.270 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:25:03.270 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@402 -- # local 
raid_bdev 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # raid_pid=1800728 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@413 -- # waitforlisten 1800728 /var/tmp/spdk-raid.sock 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@827 -- # '[' -z 1800728 ']' 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:03.271 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:03.271 12:01:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:03.271 [2024-05-14 12:01:30.277072] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:25:03.271 [2024-05-14 12:01:30.277141] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1800728 ] 00:25:03.529 [2024-05-14 12:01:30.406996] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:03.529 [2024-05-14 12:01:30.505117] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:03.529 [2024-05-14 12:01:30.566289] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:03.529 [2024-05-14 12:01:30.566331] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@860 -- # return 0 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:04.462 12:01:31 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:25:04.462 malloc1 00:25:04.462 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:04.721 [2024-05-14 12:01:31.625485] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:04.721 [2024-05-14 12:01:31.625535] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:04.721 [2024-05-14 12:01:31.625557] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x215f290 00:25:04.721 [2024-05-14 12:01:31.625569] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:04.721 [2024-05-14 12:01:31.626993] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:04.721 [2024-05-14 12:01:31.627020] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:04.721 pt1 00:25:04.721 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:25:04.721 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:25:04.721 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:25:04.721 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:25:04.721 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:04.721 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 
00:25:04.721 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:25:04.721 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:04.721 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:25:04.979 malloc2 00:25:04.979 12:01:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:04.979 [2024-05-14 12:01:32.055710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:04.979 [2024-05-14 12:01:32.055754] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:04.979 [2024-05-14 12:01:32.055773] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fbb170 00:25:04.979 [2024-05-14 12:01:32.055786] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:04.979 [2024-05-14 12:01:32.057034] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:04.979 [2024-05-14 12:01:32.057062] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:04.979 pt2 00:25:05.238 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:25:05.238 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:25:05.238 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 
00:25:05.238 [2024-05-14 12:01:32.300381] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:05.238 [2024-05-14 12:01:32.301592] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:05.238 [2024-05-14 12:01:32.301739] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fbc900 00:25:05.238 [2024-05-14 12:01:32.301752] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:05.238 [2024-05-14 12:01:32.301823] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fb9ef0 00:25:05.238 [2024-05-14 12:01:32.301907] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fbc900 00:25:05.238 [2024-05-14 12:01:32.301916] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fbc900 00:25:05.238 [2024-05-14 12:01:32.301972] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:05.238 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:05.238 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:05.238 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:05.495 "name": "raid_bdev1", 00:25:05.495 "uuid": "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b", 00:25:05.495 "strip_size_kb": 0, 00:25:05.495 "state": "online", 00:25:05.495 "raid_level": "raid1", 00:25:05.495 "superblock": true, 00:25:05.495 "num_base_bdevs": 2, 00:25:05.495 "num_base_bdevs_discovered": 2, 00:25:05.495 "num_base_bdevs_operational": 2, 00:25:05.495 "base_bdevs_list": [ 00:25:05.495 { 00:25:05.495 "name": "pt1", 00:25:05.495 "uuid": "1d7d5869-e78e-5617-bd76-87f00f537928", 00:25:05.495 "is_configured": true, 00:25:05.495 "data_offset": 256, 00:25:05.495 "data_size": 7936 00:25:05.495 }, 00:25:05.495 { 00:25:05.495 "name": "pt2", 00:25:05.495 "uuid": "70548047-75b2-5829-95f3-8711cc5abcc1", 00:25:05.495 "is_configured": true, 00:25:05.495 "data_offset": 256, 00:25:05.495 "data_size": 7936 00:25:05.495 } 00:25:05.495 ] 00:25:05.495 }' 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:05.495 12:01:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:06.428 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:25:06.428 12:01:33 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:25:06.428 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:25:06.428 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:25:06.428 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:25:06.428 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@199 -- # local name 00:25:06.429 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:06.429 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:25:06.429 [2024-05-14 12:01:33.407529] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:06.429 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:25:06.429 "name": "raid_bdev1", 00:25:06.429 "aliases": [ 00:25:06.429 "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b" 00:25:06.429 ], 00:25:06.429 "product_name": "Raid Volume", 00:25:06.429 "block_size": 4128, 00:25:06.429 "num_blocks": 7936, 00:25:06.429 "uuid": "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b", 00:25:06.429 "md_size": 32, 00:25:06.429 "md_interleave": true, 00:25:06.429 "dif_type": 0, 00:25:06.429 "assigned_rate_limits": { 00:25:06.429 "rw_ios_per_sec": 0, 00:25:06.429 "rw_mbytes_per_sec": 0, 00:25:06.429 "r_mbytes_per_sec": 0, 00:25:06.429 "w_mbytes_per_sec": 0 00:25:06.429 }, 00:25:06.429 "claimed": false, 00:25:06.429 "zoned": false, 00:25:06.429 "supported_io_types": { 00:25:06.429 "read": true, 00:25:06.429 "write": true, 00:25:06.429 "unmap": false, 00:25:06.429 "write_zeroes": true, 00:25:06.429 "flush": false, 00:25:06.429 "reset": true, 
00:25:06.429 "compare": false, 00:25:06.429 "compare_and_write": false, 00:25:06.429 "abort": false, 00:25:06.429 "nvme_admin": false, 00:25:06.429 "nvme_io": false 00:25:06.429 }, 00:25:06.429 "memory_domains": [ 00:25:06.429 { 00:25:06.429 "dma_device_id": "system", 00:25:06.429 "dma_device_type": 1 00:25:06.429 }, 00:25:06.429 { 00:25:06.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:06.429 "dma_device_type": 2 00:25:06.429 }, 00:25:06.429 { 00:25:06.429 "dma_device_id": "system", 00:25:06.429 "dma_device_type": 1 00:25:06.429 }, 00:25:06.429 { 00:25:06.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:06.429 "dma_device_type": 2 00:25:06.429 } 00:25:06.429 ], 00:25:06.429 "driver_specific": { 00:25:06.429 "raid": { 00:25:06.429 "uuid": "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b", 00:25:06.429 "strip_size_kb": 0, 00:25:06.429 "state": "online", 00:25:06.429 "raid_level": "raid1", 00:25:06.429 "superblock": true, 00:25:06.429 "num_base_bdevs": 2, 00:25:06.429 "num_base_bdevs_discovered": 2, 00:25:06.429 "num_base_bdevs_operational": 2, 00:25:06.429 "base_bdevs_list": [ 00:25:06.429 { 00:25:06.429 "name": "pt1", 00:25:06.429 "uuid": "1d7d5869-e78e-5617-bd76-87f00f537928", 00:25:06.429 "is_configured": true, 00:25:06.429 "data_offset": 256, 00:25:06.429 "data_size": 7936 00:25:06.429 }, 00:25:06.429 { 00:25:06.429 "name": "pt2", 00:25:06.429 "uuid": "70548047-75b2-5829-95f3-8711cc5abcc1", 00:25:06.429 "is_configured": true, 00:25:06.429 "data_offset": 256, 00:25:06.429 "data_size": 7936 00:25:06.429 } 00:25:06.429 ] 00:25:06.429 } 00:25:06.429 } 00:25:06.429 }' 00:25:06.429 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:06.429 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:25:06.429 pt2' 00:25:06.429 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:06.429 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:06.429 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:06.688 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:06.688 "name": "pt1", 00:25:06.688 "aliases": [ 00:25:06.688 "1d7d5869-e78e-5617-bd76-87f00f537928" 00:25:06.688 ], 00:25:06.688 "product_name": "passthru", 00:25:06.688 "block_size": 4128, 00:25:06.688 "num_blocks": 8192, 00:25:06.688 "uuid": "1d7d5869-e78e-5617-bd76-87f00f537928", 00:25:06.688 "md_size": 32, 00:25:06.688 "md_interleave": true, 00:25:06.688 "dif_type": 0, 00:25:06.688 "assigned_rate_limits": { 00:25:06.688 "rw_ios_per_sec": 0, 00:25:06.688 "rw_mbytes_per_sec": 0, 00:25:06.688 "r_mbytes_per_sec": 0, 00:25:06.688 "w_mbytes_per_sec": 0 00:25:06.688 }, 00:25:06.688 "claimed": true, 00:25:06.688 "claim_type": "exclusive_write", 00:25:06.688 "zoned": false, 00:25:06.688 "supported_io_types": { 00:25:06.688 "read": true, 00:25:06.688 "write": true, 00:25:06.688 "unmap": true, 00:25:06.688 "write_zeroes": true, 00:25:06.688 "flush": true, 00:25:06.688 "reset": true, 00:25:06.688 "compare": false, 00:25:06.688 "compare_and_write": false, 00:25:06.688 "abort": true, 00:25:06.688 "nvme_admin": false, 00:25:06.688 "nvme_io": false 00:25:06.688 }, 00:25:06.688 "memory_domains": [ 00:25:06.688 { 00:25:06.688 "dma_device_id": "system", 00:25:06.688 "dma_device_type": 1 00:25:06.688 }, 00:25:06.688 { 00:25:06.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:06.688 "dma_device_type": 2 00:25:06.688 } 00:25:06.688 ], 00:25:06.688 "driver_specific": { 00:25:06.688 "passthru": { 00:25:06.688 "name": "pt1", 00:25:06.688 "base_bdev_name": "malloc1" 00:25:06.688 } 00:25:06.688 } 
00:25:06.688 }' 00:25:06.688 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:06.688 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:06.945 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:06.945 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:06.945 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:06.945 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:06.945 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:06.945 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:06.945 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:06.946 12:01:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:07.204 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:07.204 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:07.204 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:07.204 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:07.204 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:07.462 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:07.462 "name": "pt2", 00:25:07.462 "aliases": [ 
00:25:07.462 "70548047-75b2-5829-95f3-8711cc5abcc1" 00:25:07.462 ], 00:25:07.462 "product_name": "passthru", 00:25:07.462 "block_size": 4128, 00:25:07.462 "num_blocks": 8192, 00:25:07.462 "uuid": "70548047-75b2-5829-95f3-8711cc5abcc1", 00:25:07.462 "md_size": 32, 00:25:07.462 "md_interleave": true, 00:25:07.462 "dif_type": 0, 00:25:07.462 "assigned_rate_limits": { 00:25:07.462 "rw_ios_per_sec": 0, 00:25:07.462 "rw_mbytes_per_sec": 0, 00:25:07.462 "r_mbytes_per_sec": 0, 00:25:07.462 "w_mbytes_per_sec": 0 00:25:07.462 }, 00:25:07.462 "claimed": true, 00:25:07.462 "claim_type": "exclusive_write", 00:25:07.462 "zoned": false, 00:25:07.462 "supported_io_types": { 00:25:07.462 "read": true, 00:25:07.462 "write": true, 00:25:07.462 "unmap": true, 00:25:07.462 "write_zeroes": true, 00:25:07.462 "flush": true, 00:25:07.462 "reset": true, 00:25:07.462 "compare": false, 00:25:07.462 "compare_and_write": false, 00:25:07.462 "abort": true, 00:25:07.462 "nvme_admin": false, 00:25:07.462 "nvme_io": false 00:25:07.462 }, 00:25:07.462 "memory_domains": [ 00:25:07.462 { 00:25:07.462 "dma_device_id": "system", 00:25:07.462 "dma_device_type": 1 00:25:07.462 }, 00:25:07.462 { 00:25:07.463 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:07.463 "dma_device_type": 2 00:25:07.463 } 00:25:07.463 ], 00:25:07.463 "driver_specific": { 00:25:07.463 "passthru": { 00:25:07.463 "name": "pt2", 00:25:07.463 "base_bdev_name": "malloc2" 00:25:07.463 } 00:25:07.463 } 00:25:07.463 }' 00:25:07.463 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:07.463 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:07.463 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:07.463 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:07.463 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:07.463 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:07.463 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:07.721 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:07.721 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:07.721 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:07.721 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:07.721 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:07.721 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:25:07.721 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:07.978 [2024-05-14 12:01:34.915506] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:07.978 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=d54b4fc3-2de9-40fd-8bfa-b947a4c3629b 00:25:07.978 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # '[' -z d54b4fc3-2de9-40fd-8bfa-b947a4c3629b ']' 00:25:07.978 12:01:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:08.235 [2024-05-14 12:01:35.155904] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:08.235 [2024-05-14 12:01:35.155928] bdev_raid.c:1845:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:25:08.235 [2024-05-14 12:01:35.155991] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:08.235 [2024-05-14 12:01:35.156050] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:08.235 [2024-05-14 12:01:35.156062] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fbc900 name raid_bdev1, state offline 00:25:08.235 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.235 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:25:08.494 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:25:08.494 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:25:08.494 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:25:08.494 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:08.753 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:25:08.753 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:09.012 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:25:09.012 12:01:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # 
jq -r '[.[] | select(.product_name == "passthru")] | any' 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:09.269 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:09.526 [2024-05-14 12:01:36.375089] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:25:09.526 [2024-05-14 12:01:36.376468] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:25:09.527 [2024-05-14 12:01:36.376531] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:25:09.527 [2024-05-14 12:01:36.376572] bdev_raid.c:3030:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:25:09.527 [2024-05-14 12:01:36.376591] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:09.527 [2024-05-14 12:01:36.376601] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x215fa30 name raid_bdev1, state configuring 00:25:09.527 request: 00:25:09.527 { 00:25:09.527 "name": "raid_bdev1", 00:25:09.527 "raid_level": "raid1", 00:25:09.527 "base_bdevs": [ 00:25:09.527 "malloc1", 00:25:09.527 "malloc2" 00:25:09.527 ], 00:25:09.527 "superblock": false, 00:25:09.527 "method": "bdev_raid_create", 00:25:09.527 "req_id": 1 00:25:09.527 } 00:25:09.527 Got JSON-RPC error response 00:25:09.527 response: 00:25:09.527 { 00:25:09.527 "code": -17, 00:25:09.527 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:25:09.527 } 00:25:09.527 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:25:09.527 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:09.527 12:01:36 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:09.527 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:09.527 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.527 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:25:09.785 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:25:09.785 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:25:09.785 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:09.785 [2024-05-14 12:01:36.864315] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:09.785 [2024-05-14 12:01:36.864366] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:09.785 [2024-05-14 12:01:36.864389] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2155830 00:25:09.785 [2024-05-14 12:01:36.864413] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:09.785 [2024-05-14 12:01:36.865872] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:09.785 [2024-05-14 12:01:36.865903] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:09.785 [2024-05-14 12:01:36.865957] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt1 00:25:09.785 [2024-05-14 12:01:36.865987] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:25:09.785 pt1 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.045 12:01:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.045 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:10.045 "name": "raid_bdev1", 00:25:10.045 "uuid": "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b", 00:25:10.045 "strip_size_kb": 0, 00:25:10.045 "state": "configuring", 00:25:10.045 "raid_level": "raid1", 00:25:10.045 "superblock": true, 00:25:10.045 
"num_base_bdevs": 2, 00:25:10.045 "num_base_bdevs_discovered": 1, 00:25:10.045 "num_base_bdevs_operational": 2, 00:25:10.045 "base_bdevs_list": [ 00:25:10.045 { 00:25:10.045 "name": "pt1", 00:25:10.045 "uuid": "1d7d5869-e78e-5617-bd76-87f00f537928", 00:25:10.045 "is_configured": true, 00:25:10.045 "data_offset": 256, 00:25:10.045 "data_size": 7936 00:25:10.045 }, 00:25:10.045 { 00:25:10.045 "name": null, 00:25:10.045 "uuid": "70548047-75b2-5829-95f3-8711cc5abcc1", 00:25:10.045 "is_configured": false, 00:25:10.045 "data_offset": 256, 00:25:10.045 "data_size": 7936 00:25:10.045 } 00:25:10.045 ] 00:25:10.045 }' 00:25:10.045 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:10.045 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:10.977 [2024-05-14 12:01:37.943158] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:10.977 [2024-05-14 12:01:37.943211] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:10.977 [2024-05-14 12:01:37.943232] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fbe190 00:25:10.977 [2024-05-14 12:01:37.943244] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:10.977 [2024-05-14 12:01:37.943426] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:10.977 [2024-05-14 12:01:37.943442] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:10.977 [2024-05-14 12:01:37.943486] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:25:10.977 [2024-05-14 12:01:37.943504] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:10.977 [2024-05-14 12:01:37.943587] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fbe910 00:25:10.977 [2024-05-14 12:01:37.943597] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:10.977 [2024-05-14 12:01:37.943653] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fb9f70 00:25:10.977 [2024-05-14 12:01:37.943727] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fbe910 00:25:10.977 [2024-05-14 12:01:37.943736] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fbe910 00:25:10.977 [2024-05-14 12:01:37.943793] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:10.977 pt2 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:10.977 12:01:37 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.977 12:01:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.235 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:11.235 "name": "raid_bdev1", 00:25:11.235 "uuid": "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b", 00:25:11.235 "strip_size_kb": 0, 00:25:11.235 "state": "online", 00:25:11.235 "raid_level": "raid1", 00:25:11.235 "superblock": true, 00:25:11.235 "num_base_bdevs": 2, 00:25:11.235 "num_base_bdevs_discovered": 2, 00:25:11.235 "num_base_bdevs_operational": 2, 00:25:11.235 "base_bdevs_list": [ 00:25:11.235 { 00:25:11.235 "name": "pt1", 00:25:11.235 "uuid": "1d7d5869-e78e-5617-bd76-87f00f537928", 00:25:11.235 "is_configured": true, 00:25:11.235 "data_offset": 256, 00:25:11.235 "data_size": 7936 00:25:11.235 }, 00:25:11.235 { 00:25:11.235 "name": "pt2", 00:25:11.235 "uuid": "70548047-75b2-5829-95f3-8711cc5abcc1", 00:25:11.235 "is_configured": true, 00:25:11.235 "data_offset": 256, 00:25:11.235 "data_size": 7936 00:25:11.235 
} 00:25:11.235 ] 00:25:11.235 }' 00:25:11.235 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:11.235 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:11.801 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:25:11.801 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:25:11.801 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:25:11.801 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:25:11.801 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:25:11.801 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@199 -- # local name 00:25:11.801 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:11.801 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:25:12.058 [2024-05-14 12:01:38.958085] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:12.058 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:25:12.058 "name": "raid_bdev1", 00:25:12.058 "aliases": [ 00:25:12.058 "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b" 00:25:12.058 ], 00:25:12.058 "product_name": "Raid Volume", 00:25:12.058 "block_size": 4128, 00:25:12.058 "num_blocks": 7936, 00:25:12.058 "uuid": "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b", 00:25:12.058 "md_size": 32, 00:25:12.058 "md_interleave": true, 00:25:12.058 "dif_type": 0, 00:25:12.058 "assigned_rate_limits": { 00:25:12.058 
"rw_ios_per_sec": 0, 00:25:12.058 "rw_mbytes_per_sec": 0, 00:25:12.058 "r_mbytes_per_sec": 0, 00:25:12.058 "w_mbytes_per_sec": 0 00:25:12.058 }, 00:25:12.058 "claimed": false, 00:25:12.058 "zoned": false, 00:25:12.058 "supported_io_types": { 00:25:12.058 "read": true, 00:25:12.059 "write": true, 00:25:12.059 "unmap": false, 00:25:12.059 "write_zeroes": true, 00:25:12.059 "flush": false, 00:25:12.059 "reset": true, 00:25:12.059 "compare": false, 00:25:12.059 "compare_and_write": false, 00:25:12.059 "abort": false, 00:25:12.059 "nvme_admin": false, 00:25:12.059 "nvme_io": false 00:25:12.059 }, 00:25:12.059 "memory_domains": [ 00:25:12.059 { 00:25:12.059 "dma_device_id": "system", 00:25:12.059 "dma_device_type": 1 00:25:12.059 }, 00:25:12.059 { 00:25:12.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:12.059 "dma_device_type": 2 00:25:12.059 }, 00:25:12.059 { 00:25:12.059 "dma_device_id": "system", 00:25:12.059 "dma_device_type": 1 00:25:12.059 }, 00:25:12.059 { 00:25:12.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:12.059 "dma_device_type": 2 00:25:12.059 } 00:25:12.059 ], 00:25:12.059 "driver_specific": { 00:25:12.059 "raid": { 00:25:12.059 "uuid": "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b", 00:25:12.059 "strip_size_kb": 0, 00:25:12.059 "state": "online", 00:25:12.059 "raid_level": "raid1", 00:25:12.059 "superblock": true, 00:25:12.059 "num_base_bdevs": 2, 00:25:12.059 "num_base_bdevs_discovered": 2, 00:25:12.059 "num_base_bdevs_operational": 2, 00:25:12.059 "base_bdevs_list": [ 00:25:12.059 { 00:25:12.059 "name": "pt1", 00:25:12.059 "uuid": "1d7d5869-e78e-5617-bd76-87f00f537928", 00:25:12.059 "is_configured": true, 00:25:12.059 "data_offset": 256, 00:25:12.059 "data_size": 7936 00:25:12.059 }, 00:25:12.059 { 00:25:12.059 "name": "pt2", 00:25:12.059 "uuid": "70548047-75b2-5829-95f3-8711cc5abcc1", 00:25:12.059 "is_configured": true, 00:25:12.059 "data_offset": 256, 00:25:12.059 "data_size": 7936 00:25:12.059 } 00:25:12.059 ] 00:25:12.059 } 00:25:12.059 } 
00:25:12.059 }' 00:25:12.059 12:01:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:12.059 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:25:12.059 pt2' 00:25:12.059 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:12.059 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:12.059 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:12.317 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:12.317 "name": "pt1", 00:25:12.317 "aliases": [ 00:25:12.317 "1d7d5869-e78e-5617-bd76-87f00f537928" 00:25:12.317 ], 00:25:12.317 "product_name": "passthru", 00:25:12.317 "block_size": 4128, 00:25:12.317 "num_blocks": 8192, 00:25:12.317 "uuid": "1d7d5869-e78e-5617-bd76-87f00f537928", 00:25:12.317 "md_size": 32, 00:25:12.317 "md_interleave": true, 00:25:12.317 "dif_type": 0, 00:25:12.317 "assigned_rate_limits": { 00:25:12.317 "rw_ios_per_sec": 0, 00:25:12.317 "rw_mbytes_per_sec": 0, 00:25:12.317 "r_mbytes_per_sec": 0, 00:25:12.317 "w_mbytes_per_sec": 0 00:25:12.317 }, 00:25:12.317 "claimed": true, 00:25:12.317 "claim_type": "exclusive_write", 00:25:12.317 "zoned": false, 00:25:12.317 "supported_io_types": { 00:25:12.317 "read": true, 00:25:12.317 "write": true, 00:25:12.317 "unmap": true, 00:25:12.317 "write_zeroes": true, 00:25:12.317 "flush": true, 00:25:12.317 "reset": true, 00:25:12.317 "compare": false, 00:25:12.317 "compare_and_write": false, 00:25:12.317 "abort": true, 00:25:12.317 "nvme_admin": false, 00:25:12.317 "nvme_io": false 00:25:12.317 }, 00:25:12.317 
"memory_domains": [ 00:25:12.317 { 00:25:12.317 "dma_device_id": "system", 00:25:12.317 "dma_device_type": 1 00:25:12.317 }, 00:25:12.317 { 00:25:12.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:12.317 "dma_device_type": 2 00:25:12.317 } 00:25:12.317 ], 00:25:12.317 "driver_specific": { 00:25:12.317 "passthru": { 00:25:12.317 "name": "pt1", 00:25:12.317 "base_bdev_name": "malloc1" 00:25:12.317 } 00:25:12.317 } 00:25:12.317 }' 00:25:12.317 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:12.317 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:12.317 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:12.317 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:12.575 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:25:12.836 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:25:12.836 "name": "pt2", 00:25:12.836 "aliases": [ 00:25:12.836 "70548047-75b2-5829-95f3-8711cc5abcc1" 00:25:12.836 ], 00:25:12.836 "product_name": "passthru", 00:25:12.836 "block_size": 4128, 00:25:12.836 "num_blocks": 8192, 00:25:12.836 "uuid": "70548047-75b2-5829-95f3-8711cc5abcc1", 00:25:12.836 "md_size": 32, 00:25:12.836 "md_interleave": true, 00:25:12.836 "dif_type": 0, 00:25:12.836 "assigned_rate_limits": { 00:25:12.836 "rw_ios_per_sec": 0, 00:25:12.836 "rw_mbytes_per_sec": 0, 00:25:12.836 "r_mbytes_per_sec": 0, 00:25:12.836 "w_mbytes_per_sec": 0 00:25:12.836 }, 00:25:12.836 "claimed": true, 00:25:12.836 "claim_type": "exclusive_write", 00:25:12.836 "zoned": false, 00:25:12.836 "supported_io_types": { 00:25:12.836 "read": true, 00:25:12.836 "write": true, 00:25:12.836 "unmap": true, 00:25:12.836 "write_zeroes": true, 00:25:12.836 "flush": true, 00:25:12.836 "reset": true, 00:25:12.836 "compare": false, 00:25:12.836 "compare_and_write": false, 00:25:12.836 "abort": true, 00:25:12.836 "nvme_admin": false, 00:25:12.836 "nvme_io": false 00:25:12.836 }, 00:25:12.836 "memory_domains": [ 00:25:12.836 { 00:25:12.836 "dma_device_id": "system", 00:25:12.836 "dma_device_type": 1 00:25:12.836 }, 00:25:12.836 { 00:25:12.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:12.836 "dma_device_type": 2 00:25:12.836 } 00:25:12.836 ], 00:25:12.836 "driver_specific": { 00:25:12.836 "passthru": { 00:25:12.836 "name": "pt2", 00:25:12.836 "base_bdev_name": "malloc2" 00:25:12.836 } 00:25:12.836 } 00:25:12.836 }' 00:25:12.836 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 
00:25:12.836 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:25:13.173 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:25:13.173 12:01:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:13.173 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:25:13.430 [2024-05-14 12:01:40.450057] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:13.430 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@487 -- # '[' d54b4fc3-2de9-40fd-8bfa-b947a4c3629b '!=' d54b4fc3-2de9-40fd-8bfa-b947a4c3629b ']' 00:25:13.430 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@491 -- # has_redundancy 
raid1 00:25:13.430 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # case $1 in 00:25:13.430 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@215 -- # return 0 00:25:13.430 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:13.687 [2024-05-14 12:01:40.698506] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:25:13.687 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:13.687 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:13.687 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:13.687 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:13.687 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:13.687 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:13.687 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:13.687 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:13.688 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:13.688 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:13.688 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:25:13.688 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.946 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:13.946 "name": "raid_bdev1", 00:25:13.946 "uuid": "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b", 00:25:13.946 "strip_size_kb": 0, 00:25:13.946 "state": "online", 00:25:13.946 "raid_level": "raid1", 00:25:13.946 "superblock": true, 00:25:13.946 "num_base_bdevs": 2, 00:25:13.946 "num_base_bdevs_discovered": 1, 00:25:13.946 "num_base_bdevs_operational": 1, 00:25:13.946 "base_bdevs_list": [ 00:25:13.946 { 00:25:13.946 "name": null, 00:25:13.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.946 "is_configured": false, 00:25:13.946 "data_offset": 256, 00:25:13.946 "data_size": 7936 00:25:13.946 }, 00:25:13.946 { 00:25:13.946 "name": "pt2", 00:25:13.946 "uuid": "70548047-75b2-5829-95f3-8711cc5abcc1", 00:25:13.946 "is_configured": true, 00:25:13.946 "data_offset": 256, 00:25:13.946 "data_size": 7936 00:25:13.946 } 00:25:13.946 ] 00:25:13.946 }' 00:25:13.946 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:13.946 12:01:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:14.512 12:01:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:14.768 [2024-05-14 12:01:41.705137] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:14.768 [2024-05-14 12:01:41.705167] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:14.769 [2024-05-14 12:01:41.705228] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:14.769 [2024-05-14 12:01:41.705278] bdev_raid.c: 
425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:14.769 [2024-05-14 12:01:41.705291] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fbe910 name raid_bdev1, state offline 00:25:14.769 12:01:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:25:14.769 12:01:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.024 12:01:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:25:15.024 12:01:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:25:15.024 12:01:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:25:15.024 12:01:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:25:15.024 12:01:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:15.024 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:25:15.024 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:25:15.024 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:25:15.024 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:25:15.024 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # i=1 00:25:15.024 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:15.281 [2024-05-14 12:01:42.238518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:15.281 [2024-05-14 12:01:42.238566] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:15.281 [2024-05-14 12:01:42.238587] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fc3520 00:25:15.281 [2024-05-14 12:01:42.238599] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:15.281 [2024-05-14 12:01:42.240022] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:15.281 [2024-05-14 12:01:42.240047] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:15.281 [2024-05-14 12:01:42.240093] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev pt2 00:25:15.281 [2024-05-14 12:01:42.240118] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:15.281 [2024-05-14 12:01:42.240185] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fc2190 00:25:15.282 [2024-05-14 12:01:42.240196] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:15.282 [2024-05-14 12:01:42.240252] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x215f930 00:25:15.282 [2024-05-14 12:01:42.240322] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fc2190 00:25:15.282 [2024-05-14 12:01:42.240331] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fc2190 00:25:15.282 [2024-05-14 12:01:42.240384] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:15.282 pt2 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:15.282 12:01:42 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.282 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.538 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:15.538 "name": "raid_bdev1", 00:25:15.538 "uuid": "d54b4fc3-2de9-40fd-8bfa-b947a4c3629b", 00:25:15.538 "strip_size_kb": 0, 00:25:15.538 "state": "online", 00:25:15.538 "raid_level": "raid1", 00:25:15.538 "superblock": true, 00:25:15.538 "num_base_bdevs": 2, 00:25:15.538 "num_base_bdevs_discovered": 1, 00:25:15.538 "num_base_bdevs_operational": 1, 00:25:15.538 "base_bdevs_list": [ 00:25:15.538 { 00:25:15.538 "name": null, 00:25:15.538 
"uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.538 "is_configured": false, 00:25:15.538 "data_offset": 256, 00:25:15.538 "data_size": 7936 00:25:15.538 }, 00:25:15.538 { 00:25:15.538 "name": "pt2", 00:25:15.538 "uuid": "70548047-75b2-5829-95f3-8711cc5abcc1", 00:25:15.538 "is_configured": true, 00:25:15.538 "data_offset": 256, 00:25:15.538 "data_size": 7936 00:25:15.538 } 00:25:15.538 ] 00:25:15.538 }' 00:25:15.538 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:15.538 12:01:42 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:16.104 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # '[' 2 -gt 2 ']' 00:25:16.104 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@563 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:16.104 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@563 -- # jq -r '.[] | .uuid' 00:25:16.362 [2024-05-14 12:01:43.249434] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@563 -- # '[' d54b4fc3-2de9-40fd-8bfa-b947a4c3629b '!=' d54b4fc3-2de9-40fd-8bfa-b947a4c3629b ']' 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@568 -- # killprocess 1800728 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@946 -- # '[' -z 1800728 ']' 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@950 -- # kill -0 1800728 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@951 -- # uname 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 
00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1800728 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1800728' 00:25:16.362 killing process with pid 1800728 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@965 -- # kill 1800728 00:25:16.362 [2024-05-14 12:01:43.322873] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:16.362 [2024-05-14 12:01:43.322940] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:16.362 [2024-05-14 12:01:43.322986] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:16.362 [2024-05-14 12:01:43.322999] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fc2190 name raid_bdev1, state offline 00:25:16.362 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@970 -- # wait 1800728 00:25:16.362 [2024-05-14 12:01:43.339805] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:16.621 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # return 0 00:25:16.621 00:25:16.621 real 0m13.329s 00:25:16.621 user 0m23.976s 00:25:16.621 sys 0m2.546s 00:25:16.621 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1122 -- # xtrace_disable 00:25:16.621 12:01:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:16.621 ************************************ 00:25:16.621 END TEST raid_superblock_test_md_interleaved 
00:25:16.621 ************************************ 00:25:16.621 12:01:43 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:25:16.621 12:01:43 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:25:16.621 12:01:43 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:25:16.621 12:01:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:16.621 ************************************ 00:25:16.621 START TEST raid_rebuild_test_sb_md_interleaved 00:25:16.621 ************************************ 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false false 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_level=raid1 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local num_base_bdevs=2 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local superblock=true 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local background_io=false 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local verify=false 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i = 1 )) 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@581 -- # echo BaseBdev1 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:25:16.621 12:01:43 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@581 -- # echo BaseBdev2 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i++ )) 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # (( i <= num_base_bdevs )) 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # local base_bdevs 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # local raid_bdev_name=raid_bdev1 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@581 -- # local strip_size 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@582 -- # local create_arg 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@583 -- # local raid_bdev_size 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@584 -- # local data_offset 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@586 -- # '[' raid1 '!=' raid1 ']' 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@594 -- # strip_size=0 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # '[' true = true ']' 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@598 -- # create_arg+=' -s' 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # raid_pid=1802642 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@603 -- # waitforlisten 1802642 /var/tmp/spdk-raid.sock 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@827 -- # '[' -z 1802642 ']' 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:16.621 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:16.621 12:01:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:16.621 [2024-05-14 12:01:43.702274] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:25:16.621 [2024-05-14 12:01:43.702338] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1802642 ] 00:25:16.621 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:16.621 Zero copy mechanism will not be used. 
00:25:16.880 [2024-05-14 12:01:43.833548] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:16.880 [2024-05-14 12:01:43.941122] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:17.137 [2024-05-14 12:01:44.005840] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:17.137 [2024-05-14 12:01:44.005880] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:17.703 12:01:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:17.703 12:01:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # return 0 00:25:17.703 12:01:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:25:17.703 12:01:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:25:17.961 BaseBdev1_malloc 00:25:17.961 12:01:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:18.220 [2024-05-14 12:01:45.095438] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:18.220 [2024-05-14 12:01:45.095489] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.220 [2024-05-14 12:01:45.095513] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d7950 00:25:18.220 [2024-05-14 12:01:45.095526] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.220 [2024-05-14 12:01:45.097070] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.220 [2024-05-14 12:01:45.097098] vbdev_passthru.c: 
705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:18.220 BaseBdev1 00:25:18.220 12:01:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # for bdev in "${base_bdevs[@]}" 00:25:18.220 12:01:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:25:18.478 BaseBdev2_malloc 00:25:18.478 12:01:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:18.736 [2024-05-14 12:01:45.585822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:18.736 [2024-05-14 12:01:45.585869] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.736 [2024-05-14 12:01:45.585887] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1333830 00:25:18.736 [2024-05-14 12:01:45.585899] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.736 [2024-05-14 12:01:45.587260] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.736 [2024-05-14 12:01:45.587286] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:18.736 BaseBdev2 00:25:18.736 12:01:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:25:18.993 spare_malloc 00:25:18.993 12:01:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@613 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:25:18.993 spare_delay 00:25:19.251 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@614 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:19.251 [2024-05-14 12:01:46.301814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:19.251 [2024-05-14 12:01:46.301866] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:19.251 [2024-05-14 12:01:46.301887] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1336a80 00:25:19.251 [2024-05-14 12:01:46.301900] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:19.251 [2024-05-14 12:01:46.303279] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:19.251 [2024-05-14 12:01:46.303305] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:19.251 spare 00:25:19.251 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:19.509 [2024-05-14 12:01:46.530449] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:19.509 [2024-05-14 12:01:46.531795] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:19.509 [2024-05-14 12:01:46.531967] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x13382a0 00:25:19.509 [2024-05-14 12:01:46.531981] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:19.509 [2024-05-14 12:01:46.532051] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x133a970 00:25:19.509 [2024-05-14 12:01:46.532134] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x13382a0 00:25:19.509 [2024-05-14 12:01:46.532144] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13382a0 00:25:19.509 [2024-05-14 12:01:46.532203] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.509 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.767 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:25:19.767 "name": "raid_bdev1", 00:25:19.767 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:19.768 "strip_size_kb": 0, 00:25:19.768 "state": "online", 00:25:19.768 "raid_level": "raid1", 00:25:19.768 "superblock": true, 00:25:19.768 "num_base_bdevs": 2, 00:25:19.768 "num_base_bdevs_discovered": 2, 00:25:19.768 "num_base_bdevs_operational": 2, 00:25:19.768 "base_bdevs_list": [ 00:25:19.768 { 00:25:19.768 "name": "BaseBdev1", 00:25:19.768 "uuid": "706e6419-9a06-5a02-b0d1-d0bf23ac06d2", 00:25:19.768 "is_configured": true, 00:25:19.768 "data_offset": 256, 00:25:19.768 "data_size": 7936 00:25:19.768 }, 00:25:19.768 { 00:25:19.768 "name": "BaseBdev2", 00:25:19.768 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:19.768 "is_configured": true, 00:25:19.768 "data_offset": 256, 00:25:19.768 "data_size": 7936 00:25:19.768 } 00:25:19.768 ] 00:25:19.768 }' 00:25:19.768 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:19.768 12:01:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:20.333 12:01:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@621 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:20.333 12:01:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@621 -- # jq -r '.[].num_blocks' 00:25:20.590 [2024-05-14 12:01:47.609494] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:20.590 12:01:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@621 -- # raid_bdev_size=7936 00:25:20.590 12:01:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.590 12:01:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:25:20.848 12:01:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # data_offset=256 00:25:20.848 12:01:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@626 -- # '[' false = true ']' 00:25:20.848 12:01:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@629 -- # '[' false = true ']' 00:25:20.848 12:01:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:21.106 [2024-05-14 12:01:48.090537] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@648 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:21.106 
12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.106 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.364 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:21.364 "name": "raid_bdev1", 00:25:21.364 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:21.364 "strip_size_kb": 0, 00:25:21.364 "state": "online", 00:25:21.364 "raid_level": "raid1", 00:25:21.364 "superblock": true, 00:25:21.364 "num_base_bdevs": 2, 00:25:21.364 "num_base_bdevs_discovered": 1, 00:25:21.364 "num_base_bdevs_operational": 1, 00:25:21.364 "base_bdevs_list": [ 00:25:21.364 { 00:25:21.364 "name": null, 00:25:21.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:21.364 "is_configured": false, 00:25:21.364 "data_offset": 256, 00:25:21.364 "data_size": 7936 00:25:21.364 }, 00:25:21.364 { 00:25:21.364 "name": "BaseBdev2", 00:25:21.364 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:21.364 "is_configured": true, 00:25:21.364 "data_offset": 256, 00:25:21.364 "data_size": 7936 00:25:21.364 } 00:25:21.364 ] 00:25:21.364 }' 00:25:21.364 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:21.364 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:21.931 12:01:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:22.189 [2024-05-14 12:01:49.161409] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:22.189 [2024-05-14 12:01:49.164993] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1333200 00:25:22.189 [2024-05-14 12:01:49.167268] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:22.189 12:01:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # sleep 1 00:25:23.119 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:23.119 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:23.119 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:23.119 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:23.119 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:23.119 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.119 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.377 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:23.377 "name": "raid_bdev1", 00:25:23.377 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:23.377 "strip_size_kb": 0, 00:25:23.377 "state": "online", 00:25:23.377 "raid_level": "raid1", 00:25:23.377 "superblock": true, 00:25:23.377 "num_base_bdevs": 2, 00:25:23.377 "num_base_bdevs_discovered": 2, 00:25:23.377 "num_base_bdevs_operational": 2, 00:25:23.378 "process": { 00:25:23.378 "type": "rebuild", 00:25:23.378 "target": "spare", 00:25:23.378 "progress": { 00:25:23.378 "blocks": 2816, 00:25:23.378 "percent": 35 00:25:23.378 } 00:25:23.378 }, 00:25:23.378 "base_bdevs_list": [ 00:25:23.378 { 
00:25:23.378 "name": "spare", 00:25:23.378 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:23.378 "is_configured": true, 00:25:23.378 "data_offset": 256, 00:25:23.378 "data_size": 7936 00:25:23.378 }, 00:25:23.378 { 00:25:23.378 "name": "BaseBdev2", 00:25:23.378 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:23.378 "is_configured": true, 00:25:23.378 "data_offset": 256, 00:25:23.378 "data_size": 7936 00:25:23.378 } 00:25:23.378 ] 00:25:23.378 }' 00:25:23.378 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:23.378 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:23.378 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:23.378 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:23.378 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:23.636 [2024-05-14 12:01:50.655971] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:23.636 [2024-05-14 12:01:50.678966] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:23.636 [2024-05-14 12:01:50.679009] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:23.636 12:01:50 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.636 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.895 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:23.895 "name": "raid_bdev1", 00:25:23.895 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:23.895 "strip_size_kb": 0, 00:25:23.895 "state": "online", 00:25:23.895 "raid_level": "raid1", 00:25:23.895 "superblock": true, 00:25:23.895 "num_base_bdevs": 2, 00:25:23.895 "num_base_bdevs_discovered": 1, 00:25:23.895 "num_base_bdevs_operational": 1, 00:25:23.895 "base_bdevs_list": [ 00:25:23.895 { 00:25:23.895 "name": null, 00:25:23.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.895 "is_configured": false, 00:25:23.895 "data_offset": 256, 00:25:23.895 "data_size": 7936 00:25:23.895 }, 00:25:23.895 { 00:25:23.895 "name": "BaseBdev2", 00:25:23.895 "uuid": 
"c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:23.895 "is_configured": true, 00:25:23.895 "data_offset": 256, 00:25:23.895 "data_size": 7936 00:25:23.895 } 00:25:23.895 ] 00:25:23.895 }' 00:25:23.895 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:23.895 12:01:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:24.464 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:24.464 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:24.464 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:24.464 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:24.464 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:24.464 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.464 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.722 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:24.722 "name": "raid_bdev1", 00:25:24.722 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:24.722 "strip_size_kb": 0, 00:25:24.722 "state": "online", 00:25:24.722 "raid_level": "raid1", 00:25:24.722 "superblock": true, 00:25:24.722 "num_base_bdevs": 2, 00:25:24.722 "num_base_bdevs_discovered": 1, 00:25:24.722 "num_base_bdevs_operational": 1, 00:25:24.722 "base_bdevs_list": [ 00:25:24.722 { 00:25:24.722 "name": null, 00:25:24.722 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:24.722 "is_configured": false, 00:25:24.722 "data_offset": 256, 00:25:24.722 "data_size": 7936 00:25:24.722 }, 00:25:24.722 { 00:25:24.722 "name": "BaseBdev2", 00:25:24.722 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:24.722 "is_configured": true, 00:25:24.722 "data_offset": 256, 00:25:24.722 "data_size": 7936 00:25:24.722 } 00:25:24.722 ] 00:25:24.722 }' 00:25:24.722 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:24.981 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:24.981 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:24.981 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:24.981 12:01:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@667 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:25.238 [2024-05-14 12:01:52.090446] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:25.238 [2024-05-14 12:01:52.094547] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1336d50 00:25:25.238 [2024-05-14 12:01:52.096058] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:25.238 12:01:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@668 -- # sleep 1 00:25:26.173 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@669 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:26.173 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:26.173 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:26.173 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:26.173 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:26.173 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.173 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:26.431 "name": "raid_bdev1", 00:25:26.431 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:26.431 "strip_size_kb": 0, 00:25:26.431 "state": "online", 00:25:26.431 "raid_level": "raid1", 00:25:26.431 "superblock": true, 00:25:26.431 "num_base_bdevs": 2, 00:25:26.431 "num_base_bdevs_discovered": 2, 00:25:26.431 "num_base_bdevs_operational": 2, 00:25:26.431 "process": { 00:25:26.431 "type": "rebuild", 00:25:26.431 "target": "spare", 00:25:26.431 "progress": { 00:25:26.431 "blocks": 3072, 00:25:26.431 "percent": 38 00:25:26.431 } 00:25:26.431 }, 00:25:26.431 "base_bdevs_list": [ 00:25:26.431 { 00:25:26.431 "name": "spare", 00:25:26.431 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:26.431 "is_configured": true, 00:25:26.431 "data_offset": 256, 00:25:26.431 "data_size": 7936 00:25:26.431 }, 00:25:26.431 { 00:25:26.431 "name": "BaseBdev2", 00:25:26.431 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:26.431 "is_configured": true, 00:25:26.431 "data_offset": 256, 00:25:26.431 "data_size": 7936 00:25:26.431 } 00:25:26.431 ] 00:25:26.431 }' 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:26.431 12:01:53 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@671 -- # '[' true = true ']' 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@671 -- # '[' = false ']' 00:25:26.431 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 671: [: =: unary operator expected 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@696 -- # local num_base_bdevs_operational=2 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@698 -- # '[' raid1 = raid1 ']' 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@698 -- # '[' 2 -gt 2 ']' 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@711 -- # local timeout=969 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:26.431 12:01:53 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.431 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.690 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:26.690 "name": "raid_bdev1", 00:25:26.690 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:26.690 "strip_size_kb": 0, 00:25:26.690 "state": "online", 00:25:26.690 "raid_level": "raid1", 00:25:26.690 "superblock": true, 00:25:26.690 "num_base_bdevs": 2, 00:25:26.690 "num_base_bdevs_discovered": 2, 00:25:26.690 "num_base_bdevs_operational": 2, 00:25:26.690 "process": { 00:25:26.690 "type": "rebuild", 00:25:26.690 "target": "spare", 00:25:26.690 "progress": { 00:25:26.690 "blocks": 3840, 00:25:26.690 "percent": 48 00:25:26.690 } 00:25:26.690 }, 00:25:26.690 "base_bdevs_list": [ 00:25:26.690 { 00:25:26.690 "name": "spare", 00:25:26.690 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:26.690 "is_configured": true, 00:25:26.690 "data_offset": 256, 00:25:26.690 "data_size": 7936 00:25:26.690 }, 00:25:26.690 { 00:25:26.690 "name": "BaseBdev2", 00:25:26.690 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:26.690 "is_configured": true, 00:25:26.690 "data_offset": 256, 00:25:26.690 "data_size": 7936 00:25:26.690 } 00:25:26.690 ] 00:25:26.690 }' 00:25:26.690 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:26.690 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:26.690 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:26.690 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:26.690 12:01:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@716 -- # sleep 1 00:25:28.107 12:01:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:25:28.107 12:01:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:28.107 12:01:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:28.107 12:01:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:28.107 12:01:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:28.107 12:01:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:28.107 12:01:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.107 12:01:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.107 12:01:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:28.107 "name": "raid_bdev1", 00:25:28.107 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:28.107 "strip_size_kb": 0, 00:25:28.107 "state": "online", 00:25:28.107 "raid_level": "raid1", 00:25:28.107 "superblock": true, 00:25:28.107 "num_base_bdevs": 2, 00:25:28.107 "num_base_bdevs_discovered": 2, 00:25:28.107 "num_base_bdevs_operational": 2, 00:25:28.107 "process": { 00:25:28.107 "type": "rebuild", 00:25:28.107 "target": "spare", 00:25:28.107 "progress": { 00:25:28.107 "blocks": 7168, 00:25:28.107 "percent": 90 00:25:28.107 } 00:25:28.107 }, 00:25:28.107 "base_bdevs_list": [ 00:25:28.107 { 
00:25:28.107 "name": "spare", 00:25:28.107 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:28.107 "is_configured": true, 00:25:28.107 "data_offset": 256, 00:25:28.107 "data_size": 7936 00:25:28.107 }, 00:25:28.107 { 00:25:28.107 "name": "BaseBdev2", 00:25:28.107 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:28.107 "is_configured": true, 00:25:28.107 "data_offset": 256, 00:25:28.107 "data_size": 7936 00:25:28.107 } 00:25:28.107 ] 00:25:28.107 }' 00:25:28.107 12:01:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:28.107 12:01:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:28.107 12:01:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:28.107 12:01:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:28.107 12:01:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@716 -- # sleep 1 00:25:28.365 [2024-05-14 12:01:55.220264] bdev_raid.c:2741:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:28.365 [2024-05-14 12:01:55.220327] bdev_raid.c:2458:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:28.365 [2024-05-14 12:01:55.220416] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@712 -- # (( SECONDS < timeout )) 00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@713 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 
00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:29.297 "name": "raid_bdev1", 00:25:29.297 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:29.297 "strip_size_kb": 0, 00:25:29.297 "state": "online", 00:25:29.297 "raid_level": "raid1", 00:25:29.297 "superblock": true, 00:25:29.297 "num_base_bdevs": 2, 00:25:29.297 "num_base_bdevs_discovered": 2, 00:25:29.297 "num_base_bdevs_operational": 2, 00:25:29.297 "base_bdevs_list": [ 00:25:29.297 { 00:25:29.297 "name": "spare", 00:25:29.297 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:29.297 "is_configured": true, 00:25:29.297 "data_offset": 256, 00:25:29.297 "data_size": 7936 00:25:29.297 }, 00:25:29.297 { 00:25:29.297 "name": "BaseBdev2", 00:25:29.297 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:29.297 "is_configured": true, 00:25:29.297 "data_offset": 256, 00:25:29.297 "data_size": 7936 00:25:29.297 } 00:25:29.297 ] 00:25:29.297 }' 00:25:29.297 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:29.555 12:01:56 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # break 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@720 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.555 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:29.814 "name": "raid_bdev1", 00:25:29.814 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:29.814 "strip_size_kb": 0, 00:25:29.814 "state": "online", 00:25:29.814 "raid_level": "raid1", 00:25:29.814 "superblock": true, 00:25:29.814 "num_base_bdevs": 2, 00:25:29.814 "num_base_bdevs_discovered": 2, 00:25:29.814 "num_base_bdevs_operational": 2, 00:25:29.814 "base_bdevs_list": [ 00:25:29.814 { 00:25:29.814 "name": "spare", 00:25:29.814 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:29.814 "is_configured": true, 00:25:29.814 "data_offset": 256, 00:25:29.814 "data_size": 7936 00:25:29.814 }, 00:25:29.814 { 00:25:29.814 "name": "BaseBdev2", 00:25:29.814 "uuid": 
"c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:29.814 "is_configured": true, 00:25:29.814 "data_offset": 256, 00:25:29.814 "data_size": 7936 00:25:29.814 } 00:25:29.814 ] 00:25:29.814 }' 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:29.814 12:01:56 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.814 12:01:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.072 12:01:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:30.072 "name": "raid_bdev1", 00:25:30.072 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:30.072 "strip_size_kb": 0, 00:25:30.072 "state": "online", 00:25:30.072 "raid_level": "raid1", 00:25:30.072 "superblock": true, 00:25:30.072 "num_base_bdevs": 2, 00:25:30.072 "num_base_bdevs_discovered": 2, 00:25:30.072 "num_base_bdevs_operational": 2, 00:25:30.072 "base_bdevs_list": [ 00:25:30.072 { 00:25:30.072 "name": "spare", 00:25:30.072 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:30.072 "is_configured": true, 00:25:30.072 "data_offset": 256, 00:25:30.072 "data_size": 7936 00:25:30.072 }, 00:25:30.072 { 00:25:30.072 "name": "BaseBdev2", 00:25:30.072 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:30.072 "is_configured": true, 00:25:30.072 "data_offset": 256, 00:25:30.072 "data_size": 7936 00:25:30.072 } 00:25:30.072 ] 00:25:30.072 }' 00:25:30.072 12:01:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:30.072 12:01:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:30.638 12:01:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@724 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:30.895 [2024-05-14 12:01:57.783063] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:30.895 [2024-05-14 12:01:57.783089] bdev_raid.c:1845:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online 
to offline 00:25:30.895 [2024-05-14 12:01:57.783148] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:30.895 [2024-05-14 12:01:57.783205] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:30.895 [2024-05-14 12:01:57.783217] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13382a0 name raid_bdev1, state offline 00:25:30.895 12:01:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@725 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.895 12:01:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@725 -- # jq length 00:25:31.155 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@725 -- # [[ 0 == 0 ]] 00:25:31.155 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@727 -- # '[' false = true ']' 00:25:31.155 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # '[' true = true ']' 00:25:31.155 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:25:31.155 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev1 ']' 00:25:31.155 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:31.155 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:31.414 [2024-05-14 12:01:58.360557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:31.414 [2024-05-14 12:01:58.360600] vbdev_passthru.c: 
636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.414 [2024-05-14 12:01:58.360619] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c5020 00:25:31.414 [2024-05-14 12:01:58.360631] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.414 [2024-05-14 12:01:58.362369] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.414 [2024-05-14 12:01:58.362397] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:31.414 [2024-05-14 12:01:58.362457] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:31.414 [2024-05-14 12:01:58.362483] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:31.414 BaseBdev1 00:25:31.414 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@750 -- # for bdev in "${base_bdevs[@]}" 00:25:31.414 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@751 -- # '[' -z BaseBdev2 ']' 00:25:31.414 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev2 00:25:31.674 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:31.674 [2024-05-14 12:01:58.673379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:31.674 [2024-05-14 12:01:58.673418] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.674 [2024-05-14 12:01:58.673435] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1339dd0 00:25:31.674 [2024-05-14 12:01:58.673447] vbdev_passthru.c: 
691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.674 [2024-05-14 12:01:58.673593] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.674 [2024-05-14 12:01:58.673608] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:31.674 [2024-05-14 12:01:58.673650] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev2 00:25:31.674 [2024-05-14 12:01:58.673661] bdev_raid.c:3396:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev2 (3) greater than existing raid bdev raid_bdev1 (1) 00:25:31.674 [2024-05-14 12:01:58.673671] bdev_raid.c:2310:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:31.674 [2024-05-14 12:01:58.673687] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x133aa70 name raid_bdev1, state configuring 00:25:31.674 [2024-05-14 12:01:58.673716] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:31.674 BaseBdev2 00:25:31.674 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@757 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:31.934 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@758 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:31.934 [2024-05-14 12:01:58.986215] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:31.934 [2024-05-14 12:01:58.986255] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.934 [2024-05-14 12:01:58.986274] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1336cb0 00:25:31.934 [2024-05-14 12:01:58.986286] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.934 [2024-05-14 
12:01:58.986473] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.934 [2024-05-14 12:01:58.986490] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:31.934 [2024-05-14 12:01:58.986546] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:25:31.934 [2024-05-14 12:01:58.986565] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:31.934 spare 00:25:31.934 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:31.934 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:31.934 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:31.934 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:31.934 12:01:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:31.934 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:25:31.934 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:31.934 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:31.934 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:31.934 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:31.934 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.934 12:01:59 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.193 [2024-05-14 12:01:59.086891] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: io device register 0x13398b0 00:25:32.193 [2024-05-14 12:01:59.086909] bdev_raid.c:1696:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:32.193 [2024-05-14 12:01:59.086982] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1337eb0 00:25:32.193 [2024-05-14 12:01:59.087080] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13398b0 00:25:32.193 [2024-05-14 12:01:59.087090] bdev_raid.c:1726:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13398b0 00:25:32.193 [2024-05-14 12:01:59.087159] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:32.193 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:32.193 "name": "raid_bdev1", 00:25:32.193 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:32.193 "strip_size_kb": 0, 00:25:32.193 "state": "online", 00:25:32.193 "raid_level": "raid1", 00:25:32.193 "superblock": true, 00:25:32.193 "num_base_bdevs": 2, 00:25:32.193 "num_base_bdevs_discovered": 2, 00:25:32.193 "num_base_bdevs_operational": 2, 00:25:32.193 "base_bdevs_list": [ 00:25:32.193 { 00:25:32.193 "name": "spare", 00:25:32.193 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:32.193 "is_configured": true, 00:25:32.193 "data_offset": 256, 00:25:32.193 "data_size": 7936 00:25:32.193 }, 00:25:32.193 { 00:25:32.193 "name": "BaseBdev2", 00:25:32.193 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:32.193 "is_configured": true, 00:25:32.193 "data_offset": 256, 00:25:32.193 "data_size": 7936 00:25:32.193 } 00:25:32.193 ] 00:25:32.193 }' 00:25:32.193 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:32.193 12:01:59 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:32.761 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:32.761 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:32.761 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:32.761 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:32.761 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:32.761 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.761 12:01:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.018 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:33.018 "name": "raid_bdev1", 00:25:33.018 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:33.018 "strip_size_kb": 0, 00:25:33.018 "state": "online", 00:25:33.018 "raid_level": "raid1", 00:25:33.018 "superblock": true, 00:25:33.018 "num_base_bdevs": 2, 00:25:33.018 "num_base_bdevs_discovered": 2, 00:25:33.018 "num_base_bdevs_operational": 2, 00:25:33.018 "base_bdevs_list": [ 00:25:33.018 { 00:25:33.018 "name": "spare", 00:25:33.018 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:33.018 "is_configured": true, 00:25:33.018 "data_offset": 256, 00:25:33.018 "data_size": 7936 00:25:33.018 }, 00:25:33.018 { 00:25:33.018 "name": "BaseBdev2", 00:25:33.018 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:33.018 "is_configured": true, 00:25:33.018 "data_offset": 256, 00:25:33.018 
"data_size": 7936 00:25:33.018 } 00:25:33.018 ] 00:25:33.018 }' 00:25:33.018 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:33.275 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:33.275 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:33.275 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:33.275 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.275 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:33.533 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # [[ spare == \s\p\a\r\e ]] 00:25:33.533 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:33.792 [2024-05-14 12:02:00.626693] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
strip_size=0 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.792 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.050 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:34.050 "name": "raid_bdev1", 00:25:34.050 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:34.050 "strip_size_kb": 0, 00:25:34.050 "state": "online", 00:25:34.050 "raid_level": "raid1", 00:25:34.050 "superblock": true, 00:25:34.050 "num_base_bdevs": 2, 00:25:34.050 "num_base_bdevs_discovered": 1, 00:25:34.050 "num_base_bdevs_operational": 1, 00:25:34.050 "base_bdevs_list": [ 00:25:34.050 { 00:25:34.050 "name": null, 00:25:34.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.050 "is_configured": false, 00:25:34.050 "data_offset": 256, 00:25:34.050 "data_size": 7936 00:25:34.050 }, 00:25:34.050 { 00:25:34.050 "name": "BaseBdev2", 00:25:34.050 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:34.050 "is_configured": true, 00:25:34.050 "data_offset": 256, 00:25:34.050 "data_size": 7936 00:25:34.050 } 00:25:34.050 ] 00:25:34.050 }' 00:25:34.050 12:02:00 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:34.050 12:02:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:34.617 12:02:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:34.876 [2024-05-14 12:02:01.717595] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:34.876 [2024-05-14 12:02:01.717742] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:34.876 [2024-05-14 12:02:01.717758] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:34.876 [2024-05-14 12:02:01.717786] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:34.876 [2024-05-14 12:02:01.721232] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c4a30 00:25:34.876 [2024-05-14 12:02:01.723528] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:34.876 12:02:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # sleep 1 00:25:35.813 12:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.813 12:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:35.813 12:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:35.813 12:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:35.813 12:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 
00:25:35.813 12:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.813 12:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.073 12:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:36.073 "name": "raid_bdev1", 00:25:36.073 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:36.073 "strip_size_kb": 0, 00:25:36.073 "state": "online", 00:25:36.073 "raid_level": "raid1", 00:25:36.073 "superblock": true, 00:25:36.073 "num_base_bdevs": 2, 00:25:36.073 "num_base_bdevs_discovered": 2, 00:25:36.073 "num_base_bdevs_operational": 2, 00:25:36.073 "process": { 00:25:36.073 "type": "rebuild", 00:25:36.073 "target": "spare", 00:25:36.073 "progress": { 00:25:36.073 "blocks": 3072, 00:25:36.073 "percent": 38 00:25:36.073 } 00:25:36.073 }, 00:25:36.073 "base_bdevs_list": [ 00:25:36.073 { 00:25:36.073 "name": "spare", 00:25:36.073 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:36.073 "is_configured": true, 00:25:36.073 "data_offset": 256, 00:25:36.073 "data_size": 7936 00:25:36.073 }, 00:25:36.073 { 00:25:36.073 "name": "BaseBdev2", 00:25:36.073 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:36.073 "is_configured": true, 00:25:36.073 "data_offset": 256, 00:25:36.073 "data_size": 7936 00:25:36.073 } 00:25:36.073 ] 00:25:36.073 }' 00:25:36.073 12:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:36.073 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:36.073 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:36.073 12:02:03 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:36.073 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:36.332 [2024-05-14 12:02:03.286060] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.332 [2024-05-14 12:02:03.336201] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:36.332 [2024-05-14 12:02:03.336245] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:36.332 12:02:03 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.332 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.593 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:36.593 "name": "raid_bdev1", 00:25:36.593 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:36.593 "strip_size_kb": 0, 00:25:36.593 "state": "online", 00:25:36.593 "raid_level": "raid1", 00:25:36.593 "superblock": true, 00:25:36.593 "num_base_bdevs": 2, 00:25:36.593 "num_base_bdevs_discovered": 1, 00:25:36.593 "num_base_bdevs_operational": 1, 00:25:36.593 "base_bdevs_list": [ 00:25:36.593 { 00:25:36.593 "name": null, 00:25:36.593 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.593 "is_configured": false, 00:25:36.593 "data_offset": 256, 00:25:36.593 "data_size": 7936 00:25:36.593 }, 00:25:36.593 { 00:25:36.593 "name": "BaseBdev2", 00:25:36.593 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:36.593 "is_configured": true, 00:25:36.593 "data_offset": 256, 00:25:36.593 "data_size": 7936 00:25:36.593 } 00:25:36.593 ] 00:25:36.593 }' 00:25:36.593 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:36.593 12:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:37.167 12:02:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:37.426 [2024-05-14 12:02:04.323112] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:37.426 [2024-05-14 12:02:04.323158] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:25:37.426 [2024-05-14 12:02:04.323179] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c4940 00:25:37.426 [2024-05-14 12:02:04.323192] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:37.426 [2024-05-14 12:02:04.323374] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:37.426 [2024-05-14 12:02:04.323390] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:37.426 [2024-05-14 12:02:04.323458] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev spare 00:25:37.426 [2024-05-14 12:02:04.323470] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:37.426 [2024-05-14 12:02:04.323480] bdev_raid.c:3452:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:37.426 [2024-05-14 12:02:04.323499] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:37.426 [2024-05-14 12:02:04.326942] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1337eb0 00:25:37.426 [2024-05-14 12:02:04.328446] bdev_raid.c:2776:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:37.426 spare 00:25:37.426 12:02:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # sleep 1 00:25:38.363 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:38.363 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:38.363 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:25:38.363 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:25:38.363 12:02:05 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:38.363 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.363 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.622 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:38.622 "name": "raid_bdev1", 00:25:38.622 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:38.622 "strip_size_kb": 0, 00:25:38.622 "state": "online", 00:25:38.622 "raid_level": "raid1", 00:25:38.622 "superblock": true, 00:25:38.622 "num_base_bdevs": 2, 00:25:38.622 "num_base_bdevs_discovered": 2, 00:25:38.622 "num_base_bdevs_operational": 2, 00:25:38.622 "process": { 00:25:38.622 "type": "rebuild", 00:25:38.622 "target": "spare", 00:25:38.622 "progress": { 00:25:38.622 "blocks": 3072, 00:25:38.622 "percent": 38 00:25:38.622 } 00:25:38.622 }, 00:25:38.622 "base_bdevs_list": [ 00:25:38.622 { 00:25:38.622 "name": "spare", 00:25:38.622 "uuid": "bf7158f3-196c-566f-b35e-e480724c0146", 00:25:38.622 "is_configured": true, 00:25:38.622 "data_offset": 256, 00:25:38.622 "data_size": 7936 00:25:38.622 }, 00:25:38.622 { 00:25:38.622 "name": "BaseBdev2", 00:25:38.622 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:38.622 "is_configured": true, 00:25:38.622 "data_offset": 256, 00:25:38.622 "data_size": 7936 00:25:38.622 } 00:25:38.622 ] 00:25:38.622 }' 00:25:38.622 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:38.622 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:38.622 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq 
-r '.process.target // "none"' 00:25:38.622 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:25:38.622 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:38.881 [2024-05-14 12:02:05.926380] bdev_raid.c:2111:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:38.881 [2024-05-14 12:02:05.941202] bdev_raid.c:2467:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:38.881 [2024-05-14 12:02:05.941246] bdev_raid.c: 315:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:39.140 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 
00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.141 12:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.141 12:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:39.141 "name": "raid_bdev1", 00:25:39.141 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:39.141 "strip_size_kb": 0, 00:25:39.141 "state": "online", 00:25:39.141 "raid_level": "raid1", 00:25:39.141 "superblock": true, 00:25:39.141 "num_base_bdevs": 2, 00:25:39.141 "num_base_bdevs_discovered": 1, 00:25:39.141 "num_base_bdevs_operational": 1, 00:25:39.141 "base_bdevs_list": [ 00:25:39.141 { 00:25:39.141 "name": null, 00:25:39.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.141 "is_configured": false, 00:25:39.141 "data_offset": 256, 00:25:39.141 "data_size": 7936 00:25:39.141 }, 00:25:39.141 { 00:25:39.141 "name": "BaseBdev2", 00:25:39.141 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:39.141 "is_configured": true, 00:25:39.141 "data_offset": 256, 00:25:39.141 "data_size": 7936 00:25:39.141 } 00:25:39.141 ] 00:25:39.141 }' 00:25:39.141 12:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:39.141 12:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:40.078 12:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@781 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:40.078 12:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:40.078 12:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:40.078 12:02:06 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:40.078 12:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:40.078 12:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.078 12:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.078 12:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:40.078 "name": "raid_bdev1", 00:25:40.078 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:40.078 "strip_size_kb": 0, 00:25:40.078 "state": "online", 00:25:40.078 "raid_level": "raid1", 00:25:40.078 "superblock": true, 00:25:40.078 "num_base_bdevs": 2, 00:25:40.078 "num_base_bdevs_discovered": 1, 00:25:40.078 "num_base_bdevs_operational": 1, 00:25:40.078 "base_bdevs_list": [ 00:25:40.078 { 00:25:40.078 "name": null, 00:25:40.078 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.078 "is_configured": false, 00:25:40.078 "data_offset": 256, 00:25:40.078 "data_size": 7936 00:25:40.078 }, 00:25:40.078 { 00:25:40.078 "name": "BaseBdev2", 00:25:40.078 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:40.078 "is_configured": true, 00:25:40.078 "data_offset": 256, 00:25:40.078 "data_size": 7936 00:25:40.078 } 00:25:40.078 ] 00:25:40.078 }' 00:25:40.078 12:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:40.078 12:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:40.078 12:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:40.078 12:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:40.078 12:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:40.338 12:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@785 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:40.597 [2024-05-14 12:02:07.593429] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:40.597 [2024-05-14 12:02:07.593475] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:40.597 [2024-05-14 12:02:07.593498] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1336810 00:25:40.597 [2024-05-14 12:02:07.593511] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:40.597 [2024-05-14 12:02:07.593672] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:40.597 [2024-05-14 12:02:07.593688] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:40.597 [2024-05-14 12:02:07.593734] bdev_raid.c:3528:raid_bdev_examine_load_sb_cb: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:40.597 [2024-05-14 12:02:07.593745] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:40.597 [2024-05-14 12:02:07.593756] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:40.597 BaseBdev1 00:25:40.597 12:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@786 -- # sleep 1 00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@787 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 
00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:41.532 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:41.791 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.791 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.791 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:41.791 "name": "raid_bdev1", 00:25:41.791 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:41.791 "strip_size_kb": 0, 00:25:41.791 "state": "online", 00:25:41.791 "raid_level": "raid1", 00:25:41.791 "superblock": true, 00:25:41.791 "num_base_bdevs": 2, 00:25:41.791 "num_base_bdevs_discovered": 1, 00:25:41.791 "num_base_bdevs_operational": 1, 00:25:41.791 "base_bdevs_list": [ 00:25:41.791 { 00:25:41.791 "name": 
null, 00:25:41.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:41.791 "is_configured": false, 00:25:41.791 "data_offset": 256, 00:25:41.791 "data_size": 7936 00:25:41.791 }, 00:25:41.791 { 00:25:41.791 "name": "BaseBdev2", 00:25:41.791 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:41.791 "is_configured": true, 00:25:41.791 "data_offset": 256, 00:25:41.791 "data_size": 7936 00:25:41.791 } 00:25:41.791 ] 00:25:41.791 }' 00:25:41.791 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:42.098 12:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:42.365 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@788 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:42.365 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:42.365 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:42.365 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:42.365 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:42.365 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.365 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.625 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:42.625 "name": "raid_bdev1", 00:25:42.625 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:42.625 "strip_size_kb": 0, 00:25:42.625 "state": "online", 00:25:42.625 "raid_level": "raid1", 00:25:42.625 "superblock": true, 
00:25:42.625 "num_base_bdevs": 2, 00:25:42.625 "num_base_bdevs_discovered": 1, 00:25:42.625 "num_base_bdevs_operational": 1, 00:25:42.625 "base_bdevs_list": [ 00:25:42.625 { 00:25:42.625 "name": null, 00:25:42.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.625 "is_configured": false, 00:25:42.625 "data_offset": 256, 00:25:42.625 "data_size": 7936 00:25:42.625 }, 00:25:42.625 { 00:25:42.625 "name": "BaseBdev2", 00:25:42.625 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:42.625 "is_configured": true, 00:25:42.625 "data_offset": 256, 00:25:42.625 "data_size": 7936 00:25:42.625 } 00:25:42.625 ] 00:25:42.625 }' 00:25:42.625 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:42.625 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:42.625 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@789 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:42.884 [2024-05-14 12:02:09.943769] bdev_raid.c:3122:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:42.884 [2024-05-14 12:02:09.943900] bdev_raid.c:3411:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:42.884 [2024-05-14 12:02:09.943920] bdev_raid.c:3430:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:42.884 request: 00:25:42.884 { 00:25:42.884 "raid_bdev": "raid_bdev1", 00:25:42.884 "base_bdev": "BaseBdev1", 00:25:42.884 "method": "bdev_raid_add_base_bdev", 00:25:42.884 "req_id": 1 00:25:42.884 } 00:25:42.884 Got JSON-RPC error response 00:25:42.884 response: 
00:25:42.884 { 00:25:42.884 "code": -22, 00:25:42.884 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:42.884 } 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:42.884 12:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@790 -- # sleep 1 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:25:44.284 12:02:10 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.284 12:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.284 12:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:25:44.284 "name": "raid_bdev1", 00:25:44.284 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:44.284 "strip_size_kb": 0, 00:25:44.284 "state": "online", 00:25:44.284 "raid_level": "raid1", 00:25:44.284 "superblock": true, 00:25:44.284 "num_base_bdevs": 2, 00:25:44.284 "num_base_bdevs_discovered": 1, 00:25:44.284 "num_base_bdevs_operational": 1, 00:25:44.284 "base_bdevs_list": [ 00:25:44.284 { 00:25:44.284 "name": null, 00:25:44.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.284 "is_configured": false, 00:25:44.284 "data_offset": 256, 00:25:44.284 "data_size": 7936 00:25:44.284 }, 00:25:44.284 { 00:25:44.284 "name": "BaseBdev2", 00:25:44.284 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:44.284 "is_configured": true, 00:25:44.284 "data_offset": 256, 00:25:44.284 "data_size": 7936 00:25:44.284 } 00:25:44.284 ] 00:25:44.284 }' 00:25:44.285 12:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:25:44.285 12:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:44.855 12:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@792 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:44.855 12:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:25:44.855 12:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:25:44.855 12:02:11 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:25:44.855 12:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:25:44.855 12:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.855 12:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:25:45.114 "name": "raid_bdev1", 00:25:45.114 "uuid": "4d22af11-ee57-496b-a1cf-9218fb94c10d", 00:25:45.114 "strip_size_kb": 0, 00:25:45.114 "state": "online", 00:25:45.114 "raid_level": "raid1", 00:25:45.114 "superblock": true, 00:25:45.114 "num_base_bdevs": 2, 00:25:45.114 "num_base_bdevs_discovered": 1, 00:25:45.114 "num_base_bdevs_operational": 1, 00:25:45.114 "base_bdevs_list": [ 00:25:45.114 { 00:25:45.114 "name": null, 00:25:45.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.114 "is_configured": false, 00:25:45.114 "data_offset": 256, 00:25:45.114 "data_size": 7936 00:25:45.114 }, 00:25:45.114 { 00:25:45.114 "name": "BaseBdev2", 00:25:45.114 "uuid": "c4889751-2c5c-52f4-95e5-e64ed7b0f238", 00:25:45.114 "is_configured": true, 00:25:45.114 "data_offset": 256, 00:25:45.114 "data_size": 7936 00:25:45.114 } 00:25:45.114 ] 00:25:45.114 }' 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@795 -- # killprocess 1802642 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@946 -- # '[' -z 1802642 ']' 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # kill -0 1802642 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # uname 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1802642 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1802642' 00:25:45.114 killing process with pid 1802642 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@965 -- # kill 1802642 00:25:45.114 Received shutdown signal, test time was about 60.000000 seconds 00:25:45.114 00:25:45.114 Latency(us) 00:25:45.114 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:45.114 =================================================================================================================== 00:25:45.114 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:45.114 [2024-05-14 12:02:12.195451] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:45.114 [2024-05-14 12:02:12.195548] bdev_raid.c: 448:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
00:25:45.114 [2024-05-14 12:02:12.195595] bdev_raid.c: 425:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:45.114 [2024-05-14 12:02:12.195607] bdev_raid.c: 350:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13398b0 name raid_bdev1, state offline 00:25:45.114 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@970 -- # wait 1802642 00:25:45.374 [2024-05-14 12:02:12.223129] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:45.374 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@797 -- # return 0 00:25:45.374 00:25:45.374 real 0m28.789s 00:25:45.374 user 0m45.869s 00:25:45.374 sys 0m3.818s 00:25:45.374 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1122 -- # xtrace_disable 00:25:45.374 12:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:45.374 ************************************ 00:25:45.374 END TEST raid_rebuild_test_sb_md_interleaved 00:25:45.374 ************************************ 00:25:45.633 12:02:12 bdev_raid -- bdev/bdev_raid.sh@862 -- # rm -f /raidrandtest 00:25:45.633 00:25:45.633 real 15m58.241s 00:25:45.633 user 27m15.556s 00:25:45.633 sys 2m52.765s 00:25:45.633 12:02:12 bdev_raid -- common/autotest_common.sh@1122 -- # xtrace_disable 00:25:45.633 12:02:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:45.633 ************************************ 00:25:45.633 END TEST bdev_raid 00:25:45.633 ************************************ 00:25:45.633 12:02:12 -- spdk/autotest.sh@187 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:25:45.633 12:02:12 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:25:45.633 12:02:12 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:25:45.633 12:02:12 -- common/autotest_common.sh@10 -- # set +x 00:25:45.633 
************************************ 00:25:45.633 START TEST bdevperf_config 00:25:45.633 ************************************ 00:25:45.633 12:02:12 bdevperf_config -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:25:45.633 * Looking for test storage... 00:25:45.633 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:45.633 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:45.633 12:02:12 bdevperf_config -- 
bdevperf/test_config.sh@18 -- # create_job job0 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:45.633 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:45.633 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:45.633 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:45.633 
12:02:12 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:45.633 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:45.633 12:02:12 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:48.923 12:02:15 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-05-14 12:02:12.766096] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:25:48.923 [2024-05-14 12:02:12.766167] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1806946 ] 00:25:48.923 Using job config with 4 jobs 00:25:48.923 [2024-05-14 12:02:12.906909] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:48.923 [2024-05-14 12:02:13.038592] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:48.923 cpumask for '\''job0'\'' is too big 00:25:48.923 cpumask for '\''job1'\'' is too big 00:25:48.923 cpumask for '\''job2'\'' is too big 00:25:48.923 cpumask for '\''job3'\'' is too big 00:25:48.923 Running I/O for 2 seconds... 
00:25:48.923 00:25:48.923 Latency(us) 00:25:48.923 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.02 23599.93 23.05 0.00 0.00 10837.68 1937.59 16754.42 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.02 23578.35 23.03 0.00 0.00 10822.61 1923.34 14816.83 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.02 23556.92 23.00 0.00 0.00 10807.22 1923.34 12879.25 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.03 23629.75 23.08 0.00 0.00 10749.34 926.05 11397.57 00:25:48.923 =================================================================================================================== 00:25:48.923 Total : 94364.94 92.15 0.00 0.00 10804.14 926.05 16754.42' 00:25:48.923 12:02:15 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-05-14 12:02:12.766096] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:25:48.923 [2024-05-14 12:02:12.766167] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1806946 ] 00:25:48.923 Using job config with 4 jobs 00:25:48.923 [2024-05-14 12:02:12.906909] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:48.923 [2024-05-14 12:02:13.038592] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:48.923 cpumask for '\''job0'\'' is too big 00:25:48.923 cpumask for '\''job1'\'' is too big 00:25:48.923 cpumask for '\''job2'\'' is too big 00:25:48.923 cpumask for '\''job3'\'' is too big 00:25:48.923 Running I/O for 2 seconds... 
00:25:48.923 00:25:48.923 Latency(us) 00:25:48.923 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.02 23599.93 23.05 0.00 0.00 10837.68 1937.59 16754.42 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.02 23578.35 23.03 0.00 0.00 10822.61 1923.34 14816.83 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.02 23556.92 23.00 0.00 0.00 10807.22 1923.34 12879.25 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.03 23629.75 23.08 0.00 0.00 10749.34 926.05 11397.57 00:25:48.923 =================================================================================================================== 00:25:48.923 Total : 94364.94 92.15 0.00 0.00 10804.14 926.05 16754.42' 00:25:48.923 12:02:15 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-05-14 12:02:12.766096] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:25:48.923 [2024-05-14 12:02:12.766167] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1806946 ] 00:25:48.923 Using job config with 4 jobs 00:25:48.923 [2024-05-14 12:02:12.906909] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:48.923 [2024-05-14 12:02:13.038592] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:48.923 cpumask for '\''job0'\'' is too big 00:25:48.923 cpumask for '\''job1'\'' is too big 00:25:48.923 cpumask for '\''job2'\'' is too big 00:25:48.923 cpumask for '\''job3'\'' is too big 00:25:48.923 Running I/O for 2 seconds... 
00:25:48.923 00:25:48.923 Latency(us) 00:25:48.923 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.02 23599.93 23.05 0.00 0.00 10837.68 1937.59 16754.42 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.02 23578.35 23.03 0.00 0.00 10822.61 1923.34 14816.83 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.02 23556.92 23.00 0.00 0.00 10807.22 1923.34 12879.25 00:25:48.923 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:48.923 Malloc0 : 2.03 23629.75 23.08 0.00 0.00 10749.34 926.05 11397.57 00:25:48.923 =================================================================================================================== 00:25:48.923 Total : 94364.94 92.15 0.00 0.00 10804.14 926.05 16754.42' 00:25:48.923 12:02:15 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:25:48.923 12:02:15 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:25:48.923 12:02:15 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:25:48.923 12:02:15 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:48.923 [2024-05-14 12:02:15.555198] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:25:48.924 [2024-05-14 12:02:15.555262] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1807307 ] 00:25:48.924 [2024-05-14 12:02:15.698439] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:48.924 [2024-05-14 12:02:15.819940] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:48.924 cpumask for 'job0' is too big 00:25:48.924 cpumask for 'job1' is too big 00:25:48.924 cpumask for 'job2' is too big 00:25:48.924 cpumask for 'job3' is too big 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:25:51.459 Running I/O for 2 seconds... 00:25:51.459 00:25:51.459 Latency(us) 00:25:51.459 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:51.459 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:51.459 Malloc0 : 2.01 23395.37 22.85 0.00 0.00 10926.01 1909.09 16754.42 00:25:51.459 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:51.459 Malloc0 : 2.02 23405.14 22.86 0.00 0.00 10895.67 1894.85 14930.81 00:25:51.459 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:51.459 Malloc0 : 2.03 23383.96 22.84 0.00 0.00 10880.07 1894.85 12993.22 00:25:51.459 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:51.459 Malloc0 : 2.03 23362.78 22.82 0.00 0.00 10865.26 1937.59 11340.58 00:25:51.459 =================================================================================================================== 00:25:51.459 Total : 93547.25 91.35 0.00 0.00 10891.71 1894.85 16754.42' 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:51.459 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:51.459 00:25:51.459 12:02:18 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:51.460 12:02:18 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:25:51.460 12:02:18 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:25:51.460 12:02:18 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:25:51.460 12:02:18 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:51.460 12:02:18 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:25:51.460 12:02:18 bdevperf_config -- 
bdevperf/common.sh@18 -- # job='[job2]' 00:25:51.460 12:02:18 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:51.460 00:25:51.460 12:02:18 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:51.460 12:02:18 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:53.993 12:02:20 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-05-14 12:02:18.326862] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:25:53.993 [2024-05-14 12:02:18.326929] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1807657 ] 00:25:53.993 Using job config with 3 jobs 00:25:53.993 [2024-05-14 12:02:18.479735] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:53.993 [2024-05-14 12:02:18.597619] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:53.993 cpumask for '\''job0'\'' is too big 00:25:53.993 cpumask for '\''job1'\'' is too big 00:25:53.993 cpumask for '\''job2'\'' is too big 00:25:53.993 Running I/O for 2 seconds... 
00:25:53.993 00:25:53.993 Latency(us) 00:25:53.993 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:53.993 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:53.993 Malloc0 : 2.02 31627.97 30.89 0.00 0.00 8082.01 2194.03 11910.46 00:25:53.993 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:53.993 Malloc0 : 2.02 31598.37 30.86 0.00 0.00 8069.87 1866.35 10029.86 00:25:53.993 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:53.993 Malloc0 : 2.02 31569.14 30.83 0.00 0.00 8058.83 1852.10 8434.20 00:25:53.993 =================================================================================================================== 00:25:53.993 Total : 94795.49 92.57 0.00 0.00 8070.24 1852.10 11910.46' 00:25:53.993 12:02:21 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-05-14 12:02:18.326862] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:25:53.993 [2024-05-14 12:02:18.326929] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1807657 ] 00:25:53.993 Using job config with 3 jobs 00:25:53.993 [2024-05-14 12:02:18.479735] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:53.993 [2024-05-14 12:02:18.597619] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:53.993 cpumask for '\''job0'\'' is too big 00:25:53.993 cpumask for '\''job1'\'' is too big 00:25:53.993 cpumask for '\''job2'\'' is too big 00:25:53.993 Running I/O for 2 seconds... 
00:25:53.993 00:25:53.993 Latency(us) 00:25:53.993 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:53.993 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:53.993 Malloc0 : 2.02 31627.97 30.89 0.00 0.00 8082.01 2194.03 11910.46 00:25:53.993 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:53.993 Malloc0 : 2.02 31598.37 30.86 0.00 0.00 8069.87 1866.35 10029.86 00:25:53.993 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:53.993 Malloc0 : 2.02 31569.14 30.83 0.00 0.00 8058.83 1852.10 8434.20 00:25:53.993 =================================================================================================================== 00:25:53.993 Total : 94795.49 92.57 0.00 0.00 8070.24 1852.10 11910.46' 00:25:53.993 12:02:21 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-05-14 12:02:18.326862] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:25:53.993 [2024-05-14 12:02:18.326929] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1807657 ] 00:25:53.993 Using job config with 3 jobs 00:25:53.993 [2024-05-14 12:02:18.479735] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:53.994 [2024-05-14 12:02:18.597619] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:53.994 cpumask for '\''job0'\'' is too big 00:25:53.994 cpumask for '\''job1'\'' is too big 00:25:53.994 cpumask for '\''job2'\'' is too big 00:25:53.994 Running I/O for 2 seconds... 
00:25:53.994 00:25:53.994 Latency(us) 00:25:53.994 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:53.994 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:53.994 Malloc0 : 2.02 31627.97 30.89 0.00 0.00 8082.01 2194.03 11910.46 00:25:53.994 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:53.994 Malloc0 : 2.02 31598.37 30.86 0.00 0.00 8069.87 1866.35 10029.86 00:25:53.994 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:53.994 Malloc0 : 2.02 31569.14 30.83 0.00 0.00 8058.83 1852.10 8434.20 00:25:53.994 =================================================================================================================== 00:25:53.994 Total : 94795.49 92.57 0.00 0.00 8070.24 1852.10 11910.46' 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:25:53.994 12:02:21 
bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:53.994 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:53.994 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:53.994 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 
00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:53.994 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:53.994 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:53.994 12:02:21 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:57.293 12:02:23 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-05-14 12:02:21.097371] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:25:57.293 [2024-05-14 12:02:21.097442] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1808019 ] 00:25:57.293 Using job config with 4 jobs 00:25:57.293 [2024-05-14 12:02:21.235645] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:57.293 [2024-05-14 12:02:21.350751] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:57.293 cpumask for '\''job0'\'' is too big 00:25:57.293 cpumask for '\''job1'\'' is too big 00:25:57.293 cpumask for '\''job2'\'' is too big 00:25:57.293 cpumask for '\''job3'\'' is too big 00:25:57.293 Running I/O for 2 seconds... 00:25:57.293 00:25:57.293 Latency(us) 00:25:57.293 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:57.293 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc0 : 2.04 11697.72 11.42 0.00 0.00 21871.56 3932.16 33964.74 00:25:57.293 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc1 : 2.04 11686.83 11.41 0.00 0.00 21867.88 4758.48 33964.74 00:25:57.293 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc0 : 2.04 11676.28 11.40 0.00 0.00 21808.75 3903.67 29861.62 00:25:57.293 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc1 : 2.04 11665.50 11.39 0.00 0.00 21809.35 4758.48 29861.62 00:25:57.293 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc0 : 2.04 11655.00 11.38 0.00 0.00 21747.01 3903.67 25872.47 00:25:57.293 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc1 : 2.04 11644.30 11.37 0.00 0.00 21746.07 4729.99 25872.47 00:25:57.293 Job: Malloc0 (Core Mask 
0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc0 : 2.05 11633.82 11.36 0.00 0.00 21687.57 3875.17 22339.23 00:25:57.293 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc1 : 2.05 11732.17 11.46 0.00 0.00 21484.41 990.16 22339.23 00:25:57.293 =================================================================================================================== 00:25:57.293 Total : 93391.62 91.20 0.00 0.00 21752.46 990.16 33964.74' 00:25:57.293 12:02:23 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-05-14 12:02:21.097371] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:25:57.293 [2024-05-14 12:02:21.097442] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1808019 ] 00:25:57.293 Using job config with 4 jobs 00:25:57.293 [2024-05-14 12:02:21.235645] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:57.293 [2024-05-14 12:02:21.350751] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:57.293 cpumask for '\''job0'\'' is too big 00:25:57.293 cpumask for '\''job1'\'' is too big 00:25:57.293 cpumask for '\''job2'\'' is too big 00:25:57.293 cpumask for '\''job3'\'' is too big 00:25:57.293 Running I/O for 2 seconds... 
00:25:57.293 00:25:57.293 Latency(us) 00:25:57.293 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:57.293 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc0 : 2.04 11697.72 11.42 0.00 0.00 21871.56 3932.16 33964.74 00:25:57.293 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc1 : 2.04 11686.83 11.41 0.00 0.00 21867.88 4758.48 33964.74 00:25:57.293 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc0 : 2.04 11676.28 11.40 0.00 0.00 21808.75 3903.67 29861.62 00:25:57.293 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc1 : 2.04 11665.50 11.39 0.00 0.00 21809.35 4758.48 29861.62 00:25:57.293 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc0 : 2.04 11655.00 11.38 0.00 0.00 21747.01 3903.67 25872.47 00:25:57.293 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc1 : 2.04 11644.30 11.37 0.00 0.00 21746.07 4729.99 25872.47 00:25:57.293 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.293 Malloc0 : 2.05 11633.82 11.36 0.00 0.00 21687.57 3875.17 22339.23 00:25:57.293 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.294 Malloc1 : 2.05 11732.17 11.46 0.00 0.00 21484.41 990.16 22339.23 00:25:57.294 =================================================================================================================== 00:25:57.294 Total : 93391.62 91.20 0.00 0.00 21752.46 990.16 33964.74' 00:25:57.294 12:02:23 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-05-14 12:02:21.097371] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:25:57.294 [2024-05-14 12:02:21.097442] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1808019 ] 00:25:57.294 Using job config with 4 jobs 00:25:57.294 [2024-05-14 12:02:21.235645] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:57.294 [2024-05-14 12:02:21.350751] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:57.294 cpumask for '\''job0'\'' is too big 00:25:57.294 cpumask for '\''job1'\'' is too big 00:25:57.294 cpumask for '\''job2'\'' is too big 00:25:57.294 cpumask for '\''job3'\'' is too big 00:25:57.294 Running I/O for 2 seconds... 00:25:57.294 00:25:57.294 Latency(us) 00:25:57.294 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:57.294 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.294 Malloc0 : 2.04 11697.72 11.42 0.00 0.00 21871.56 3932.16 33964.74 00:25:57.294 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.294 Malloc1 : 2.04 11686.83 11.41 0.00 0.00 21867.88 4758.48 33964.74 00:25:57.294 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.294 Malloc0 : 2.04 11676.28 11.40 0.00 0.00 21808.75 3903.67 29861.62 00:25:57.294 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.294 Malloc1 : 2.04 11665.50 11.39 0.00 0.00 21809.35 4758.48 29861.62 00:25:57.294 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.294 Malloc0 : 2.04 11655.00 11.38 0.00 0.00 21747.01 3903.67 25872.47 00:25:57.294 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.294 Malloc1 : 2.04 11644.30 11.37 0.00 0.00 21746.07 4729.99 25872.47 00:25:57.294 Job: Malloc0 (Core Mask 
0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.294 Malloc0 : 2.05 11633.82 11.36 0.00 0.00 21687.57 3875.17 22339.23 00:25:57.294 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:57.294 Malloc1 : 2.05 11732.17 11.46 0.00 0.00 21484.41 990.16 22339.23 00:25:57.294 =================================================================================================================== 00:25:57.294 Total : 93391.62 91.20 0.00 0.00 21752.46 990.16 33964.74' 00:25:57.294 12:02:23 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:25:57.294 12:02:23 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:25:57.294 12:02:23 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:25:57.294 12:02:23 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:25:57.294 12:02:23 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:57.294 12:02:23 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:25:57.294 00:25:57.294 real 0m11.282s 00:25:57.294 user 0m9.945s 00:25:57.294 sys 0m1.186s 00:25:57.294 12:02:23 bdevperf_config -- common/autotest_common.sh@1122 -- # xtrace_disable 00:25:57.294 12:02:23 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:25:57.294 ************************************ 00:25:57.294 END TEST bdevperf_config 00:25:57.294 ************************************ 00:25:57.294 12:02:23 -- spdk/autotest.sh@188 -- # uname -s 00:25:57.294 12:02:23 -- spdk/autotest.sh@188 -- # [[ Linux == Linux ]] 00:25:57.294 12:02:23 -- spdk/autotest.sh@189 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:25:57.294 12:02:23 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:25:57.294 12:02:23 -- common/autotest_common.sh@1103 -- # xtrace_disable 
00:25:57.294 12:02:23 -- common/autotest_common.sh@10 -- # set +x 00:25:57.294 ************************************ 00:25:57.294 START TEST reactor_set_interrupt 00:25:57.294 ************************************ 00:25:57.294 12:02:23 reactor_set_interrupt -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:25:57.294 * Looking for test storage... 00:25:57.294 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:57.294 12:02:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:25:57.294 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:25:57.294 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:57.294 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:57.294 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:25:57.294 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:57.294 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:25:57.294 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:25:57.294 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:25:57.294 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:25:57.294 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:25:57.294 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:25:57.294 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:25:57.294 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:25:57.294 12:02:24 reactor_set_interrupt -- 
common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:25:57.294 12:02:24 
reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:25:57.294 12:02:24 reactor_set_interrupt -- 
common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:25:57.294 12:02:24 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:25:57.295 12:02:24 
reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:25:57.295 12:02:24 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:25:57.295 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@8 -- # 
_root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:25:57.295 #define SPDK_CONFIG_H 00:25:57.295 #define SPDK_CONFIG_APPS 1 00:25:57.295 #define SPDK_CONFIG_ARCH native 00:25:57.295 #undef SPDK_CONFIG_ASAN 00:25:57.295 #undef SPDK_CONFIG_AVAHI 00:25:57.295 #undef SPDK_CONFIG_CET 00:25:57.295 #define SPDK_CONFIG_COVERAGE 1 00:25:57.295 #define SPDK_CONFIG_CROSS_PREFIX 00:25:57.295 #define SPDK_CONFIG_CRYPTO 1 00:25:57.295 #define SPDK_CONFIG_CRYPTO_MLX5 
1 00:25:57.295 #undef SPDK_CONFIG_CUSTOMOCF 00:25:57.295 #undef SPDK_CONFIG_DAOS 00:25:57.295 #define SPDK_CONFIG_DAOS_DIR 00:25:57.295 #define SPDK_CONFIG_DEBUG 1 00:25:57.295 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:25:57.295 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:25:57.295 #define SPDK_CONFIG_DPDK_INC_DIR 00:25:57.295 #define SPDK_CONFIG_DPDK_LIB_DIR 00:25:57.295 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:25:57.295 #undef SPDK_CONFIG_DPDK_UADK 00:25:57.295 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:25:57.295 #define SPDK_CONFIG_EXAMPLES 1 00:25:57.295 #undef SPDK_CONFIG_FC 00:25:57.295 #define SPDK_CONFIG_FC_PATH 00:25:57.295 #define SPDK_CONFIG_FIO_PLUGIN 1 00:25:57.295 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:25:57.295 #undef SPDK_CONFIG_FUSE 00:25:57.295 #undef SPDK_CONFIG_FUZZER 00:25:57.295 #define SPDK_CONFIG_FUZZER_LIB 00:25:57.295 #undef SPDK_CONFIG_GOLANG 00:25:57.295 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:25:57.295 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:25:57.295 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:25:57.295 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:25:57.295 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:25:57.295 #undef SPDK_CONFIG_HAVE_LIBBSD 00:25:57.295 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:25:57.295 #define SPDK_CONFIG_IDXD 1 00:25:57.295 #undef SPDK_CONFIG_IDXD_KERNEL 00:25:57.295 #define SPDK_CONFIG_IPSEC_MB 1 00:25:57.295 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:25:57.295 #define SPDK_CONFIG_ISAL 1 00:25:57.295 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:25:57.295 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:25:57.295 #define SPDK_CONFIG_LIBDIR 00:25:57.295 #undef SPDK_CONFIG_LTO 00:25:57.295 #define SPDK_CONFIG_MAX_LCORES 00:25:57.295 #define SPDK_CONFIG_NVME_CUSE 1 00:25:57.295 #undef SPDK_CONFIG_OCF 00:25:57.295 #define SPDK_CONFIG_OCF_PATH 00:25:57.295 #define SPDK_CONFIG_OPENSSL_PATH 
00:25:57.295 #undef SPDK_CONFIG_PGO_CAPTURE 00:25:57.295 #define SPDK_CONFIG_PGO_DIR 00:25:57.295 #undef SPDK_CONFIG_PGO_USE 00:25:57.295 #define SPDK_CONFIG_PREFIX /usr/local 00:25:57.295 #undef SPDK_CONFIG_RAID5F 00:25:57.295 #undef SPDK_CONFIG_RBD 00:25:57.295 #define SPDK_CONFIG_RDMA 1 00:25:57.295 #define SPDK_CONFIG_RDMA_PROV verbs 00:25:57.295 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:25:57.295 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:25:57.295 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:25:57.295 #define SPDK_CONFIG_SHARED 1 00:25:57.295 #undef SPDK_CONFIG_SMA 00:25:57.295 #define SPDK_CONFIG_TESTS 1 00:25:57.295 #undef SPDK_CONFIG_TSAN 00:25:57.295 #define SPDK_CONFIG_UBLK 1 00:25:57.295 #define SPDK_CONFIG_UBSAN 1 00:25:57.295 #undef SPDK_CONFIG_UNIT_TESTS 00:25:57.295 #undef SPDK_CONFIG_URING 00:25:57.295 #define SPDK_CONFIG_URING_PATH 00:25:57.295 #undef SPDK_CONFIG_URING_ZNS 00:25:57.295 #undef SPDK_CONFIG_USDT 00:25:57.295 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:25:57.295 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:25:57.295 #undef SPDK_CONFIG_VFIO_USER 00:25:57.295 #define SPDK_CONFIG_VFIO_USER_DIR 00:25:57.295 #define SPDK_CONFIG_VHOST 1 00:25:57.295 #define SPDK_CONFIG_VIRTIO 1 00:25:57.295 #undef SPDK_CONFIG_VTUNE 00:25:57.295 #define SPDK_CONFIG_VTUNE_DIR 00:25:57.295 #define SPDK_CONFIG_WERROR 1 00:25:57.295 #define SPDK_CONFIG_WPDK_DIR 00:25:57.295 #undef SPDK_CONFIG_XNVME 00:25:57.295 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:25:57.295 12:02:24 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:25:57.295 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:25:57.295 12:02:24 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:25:57.295 12:02:24 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e 
/etc/opt/spdk-pkgdep/paths/export.sh ]]
00:25:57.295 12:02:24 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:25:57.295 12:02:24 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:25:57.295 12:02:24 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:25:57.295 12:02:24 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:25:57.295 12:02:24 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH
00:25:57.295 12:02:24 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:25:57.295 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@68 -- # uname -s
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=()
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]=
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E'
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat)
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]]
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]]
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]]
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]]
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp)
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm)
00:25:57.296 12:02:24 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]]
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@57 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@61 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@63 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@65 -- # : 1
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@67 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@69 -- # :
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@71 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@73 -- # : 1
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@75 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@77 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@79 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@81 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@83 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@85 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@87 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@89 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@91 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@93 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@95 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@97 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@99 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@101 -- # : rdma
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@103 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@105 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@107 -- # : 1
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@109 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@111 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@113 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@115 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@117 -- # : 1
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@119 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@121 -- # : 1
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@123 -- # :
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@125 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@127 -- # : 1
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@129 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@131 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@133 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@135 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@137 -- # :
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@139 -- # : true
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@141 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@143 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@145 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@147 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@149 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@151 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@153 -- # :
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@155 -- # : 0
00:25:57.296 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@157 -- # : 0
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@159 -- # : 0
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@161 -- # : 0
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@163 -- # : 0
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@166 -- # :
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@168 -- # : 0
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@170 -- # : 0
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@199 -- # cat
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@235 -- # echo leak:libfuse3.so
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@237 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@237 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@239 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@239 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@241 -- # '[' -z /var/spdk/dependencies ']'
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@244 -- # export DEPENDENCY_DIR
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@248 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@248 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@252 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@252 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@253 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@255 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@255 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@258 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@258 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@261 -- # '[' 0 -eq 0 ']'
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@262 -- # export valgrind=
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@262 -- # valgrind=
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@268 -- # uname -s
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@268 -- # '[' Linux = Linux ']'
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@269 -- # HUGEMEM=4096
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@270 -- # export CLEAR_HUGE=yes
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@270 -- # CLEAR_HUGE=yes
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@271 -- # [[ 1 -eq 1 ]]
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@275 -- # export HUGE_EVEN_ALLOC=yes
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@275 -- # HUGE_EVEN_ALLOC=yes
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@278 -- # MAKE=make
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKEFLAGS=-j72
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@295 -- # export HUGEMEM=4096
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@295 -- # HUGEMEM=4096
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@297 -- # NO_HUGE=()
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@298 -- # TEST_MODE=
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@317 -- # [[ -z 1808410 ]]
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@317 -- # kill -0 1808410
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@1676 -- # set_test_storage 2147483648
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@327 -- # [[ -v testdir ]]
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@329 -- # local requested_size=2147483648
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local mount target_dir
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@332 -- # local -A mounts fss sizes avails uses
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local source fs size avail mount use
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@335 -- # local storage_fallback storage_candidates
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@337 -- # mktemp -udt spdk.XXXXXX
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@337 -- # storage_fallback=/tmp/spdk.GyFl89
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@342 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@344 -- # [[ -n '' ]]
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@349 -- # [[ -n '' ]]
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@354 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.GyFl89/tests/interrupt /tmp/spdk.GyFl89
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@357 -- # requested_size=2214592512
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@326 -- # df -T
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@326 -- # grep -v Filesystem
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_devtmpfs
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=devtmpfs
00:25:57.297 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=67108864
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=67108864
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=0
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=/dev/pmem0
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=ext2
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=969789440
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=5284429824
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=4314640384
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_root
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=overlay
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=88914075648
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=94508531712
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=5594456064
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=47249555456
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254265856
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=4710400
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=18892279808
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=18901708800
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=9428992
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=47253630976
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254265856
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=634880
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=9450848256
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=9450852352
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=4096
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@365 -- # printf '* Looking for test storage...\n'
00:25:57.298 * Looking for test storage...
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@367 -- # local target_space new_size
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@368 -- # for target_dir in "${storage_candidates[@]}"
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@371 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@371 -- # awk '$1 !~ /Filesystem/{print $6}'
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@371 -- # mount=/
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@373 -- # target_space=88914075648
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@374 -- # (( target_space == 0 || target_space < requested_size ))
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@377 -- # (( target_space >= requested_size ))
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@379 -- # [[ overlay == tmpfs ]]
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@379 -- # [[ overlay == ramfs ]]
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@379 -- # [[ / == / ]]
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@380 -- # new_size=7809048576
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@381 -- # (( new_size * 100 / sizes[/] > 95 ))
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@386 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@386 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@387 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:25:57.298 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@388 -- # return 0
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@1678 -- # set -o errtrace
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@1679 -- # shopt -s extdebug
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # trap 'trap - ERR; print_backtrace >&2' ERR
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ '
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # true
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@1685 -- # xtrace_fd
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]]
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]]
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]'
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 ))
00:25:57.298 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1808451
00:25:57.298 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT
00:25:57.299 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1808451 /var/tmp/spdk.sock
00:25:57.299 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@827 -- # '[' -z 1808451 ']'
00:25:57.299 12:02:24 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g
00:25:57.299 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock
00:25:57.299 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@832 -- # local max_retries=100
00:25:57.299 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:25:57.299 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:25:57.299 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@836 -- # xtrace_disable
00:25:57.299 12:02:24 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x
00:25:57.299 [2024-05-14 12:02:24.279779] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... [2024-05-14 12:02:24.279844] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1808451 ]
00:25:57.558 [2024-05-14 12:02:24.394703] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:25:57.558 [2024-05-14 12:02:24.494927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:25:57.558 [2024-05-14 12:02:24.495013] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:25:57.558 [2024-05-14 12:02:24.495017] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:25:57.558 [2024-05-14 12:02:24.567134] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:25:58.494 12:02:25 reactor_set_interrupt -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:58.494 12:02:25 reactor_set_interrupt -- common/autotest_common.sh@860 -- # return 0 00:25:58.494 12:02:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:25:58.494 12:02:25 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:58.494 Malloc0 00:25:58.494 Malloc1 00:25:58.494 Malloc2 00:25:58.494 12:02:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:25:58.494 12:02:25 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:25:58.494 12:02:25 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:25:58.494 12:02:25 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:25:58.494 5000+0 records in 00:25:58.494 5000+0 records out 00:25:58.494 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0275965 s, 371 MB/s 00:25:58.494 12:02:25 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:25:58.753 AIO0 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1808451 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1808451 without_thd 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1808451 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:58.753 12:02:25 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:25:59.011 12:02:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:25:59.011 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:25:59.011 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:25:59.011 12:02:26 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:25:59.011 12:02:26 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:25:59.011 12:02:26 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:25:59.011 12:02:26 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:59.011 12:02:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:25:59.011 12:02:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:25:59.270 spdk_thread ids are 1 on reactor0. 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1808451 0 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1808451 0 idle 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1808451 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1808451 -w 256 00:25:59.270 12:02:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1808451 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0' 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1808451 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:59.537 12:02:26 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1808451 1 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1808451 1 idle 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1808451 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1808451 -w 256 00:25:59.537 12:02:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:25:59.798 12:02:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1808454 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 
00:25:59.798 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1808454 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1808451 2 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1808451 2 idle 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1808451 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:59.799 12:02:26 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # grep reactor_2 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1808451 -w 256 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1808455 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1808455 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:25:59.799 12:02:26 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:00.058 12:02:26 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:00.058 12:02:26 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:00.058 12:02:26 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:00.058 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:26:00.058 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:26:00.058 12:02:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:26:00.058 [2024-05-14 12:02:27.111922] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:26:00.058 12:02:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:26:00.317 [2024-05-14 12:02:27.355651] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:26:00.317 [2024-05-14 12:02:27.356005] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:00.317 12:02:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:26:00.575 [2024-05-14 12:02:27.599553] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:26:00.575 [2024-05-14 12:02:27.599674] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:00.575 12:02:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:00.575 12:02:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1808451 0 00:26:00.575 12:02:27 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1808451 0 busy 00:26:00.576 12:02:27 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1808451 00:26:00.576 12:02:27 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:00.576 12:02:27 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:00.576 12:02:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:00.576 12:02:27 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:00.576 12:02:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:00.576 12:02:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:00.576 12:02:27 reactor_set_interrupt 
-- interrupt/common.sh@24 -- # top -bHn 1 -p 1808451 -w 256 00:26:00.576 12:02:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1808451 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0' 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1808451 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1808451 2 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1808451 2 busy 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1808451 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:00.842 12:02:27 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1808451 -w 256 00:26:00.842 12:02:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1808455 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2' 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1808455 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:01.107 12:02:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:26:01.107 [2024-05-14 12:02:28.147542] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:26:01.107 [2024-05-14 12:02:28.147648] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1808451 2 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1808451 2 idle 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1808451 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1808451 -w 256 00:26:01.107 12:02:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:01.365 12:02:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1808455 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.54 reactor_2' 00:26:01.365 12:02:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1808455 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.54 reactor_2 00:26:01.365 12:02:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:01.365 12:02:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:01.365 12:02:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:01.365 12:02:28 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:01.365 12:02:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:01.365 12:02:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:01.365 12:02:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:01.366 12:02:28 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:01.366 12:02:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:26:01.625 [2024-05-14 12:02:28.571531] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:26:01.625 [2024-05-14 12:02:28.571692] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:01.625 12:02:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:26:01.625 12:02:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:26:01.625 12:02:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:26:01.883 [2024-05-14 12:02:28.811720] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1808451 0 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1808451 0 idle 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1808451 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1808451 -w 256 00:26:01.883 12:02:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:02.141 12:02:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1808451 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.60 reactor_0' 00:26:02.141 12:02:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1808451 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.60 reactor_0 00:26:02.141 12:02:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:02.141 12:02:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:26:02.141 12:02:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1808451 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@946 -- # '[' -z 1808451 ']' 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@950 -- # kill -0 1808451 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@951 -- # uname 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1808451 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1808451' 00:26:02.141 killing process with pid 1808451 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@965 -- # kill 1808451 00:26:02.141 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@970 -- # wait 1808451 00:26:02.400 12:02:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:26:02.400 12:02:29 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:02.400 12:02:29 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:26:02.400 12:02:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:02.400 12:02:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:02.400 12:02:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1809225 00:26:02.400 12:02:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:02.400 12:02:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:02.400 12:02:29 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1809225 /var/tmp/spdk.sock 00:26:02.400 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@827 -- # '[' -z 1809225 ']' 00:26:02.400 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:02.400 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:02.400 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:02.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:02.400 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:02.400 12:02:29 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:02.400 [2024-05-14 12:02:29.343656] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:26:02.400 [2024-05-14 12:02:29.343730] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1809225 ] 00:26:02.400 [2024-05-14 12:02:29.465009] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:02.659 [2024-05-14 12:02:29.570332] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:02.659 [2024-05-14 12:02:29.570422] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:02.659 [2024-05-14 12:02:29.570427] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:02.659 [2024-05-14 12:02:29.641457] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:03.226 12:02:30 reactor_set_interrupt -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:03.226 12:02:30 reactor_set_interrupt -- common/autotest_common.sh@860 -- # return 0 00:26:03.226 12:02:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:26:03.226 12:02:30 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:03.487 Malloc0 00:26:03.487 Malloc1 00:26:03.487 Malloc2 00:26:03.487 12:02:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:26:03.487 12:02:30 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:26:03.487 12:02:30 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:03.487 12:02:30 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:03.487 5000+0 records in 00:26:03.487 5000+0 records out 00:26:03.487 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0273076 s, 375 MB/s 00:26:03.487 12:02:30 
reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:03.746 AIO0 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1809225 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1809225 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1809225 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:03.746 12:02:30 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:04.005 12:02:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:26:04.005 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:26:04.005 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # 
reactor_get_thread_ids 0x4 00:26:04.005 12:02:31 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:26:04.005 12:02:31 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:04.005 12:02:31 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:26:04.005 12:02:31 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:04.005 12:02:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:04.005 12:02:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:26:04.263 spdk_thread ids are 1 on reactor0. 
00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1809225 0 00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1809225 0 idle 00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1809225 00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:04.263 12:02:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:04.264 12:02:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:04.264 12:02:31 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:04.264 12:02:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:04.264 12:02:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:04.264 12:02:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1809225 -w 256 00:26:04.264 12:02:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1809225 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0' 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1809225 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1809225 1 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1809225 1 idle 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1809225 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1809225 -w 256 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1809228 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1809228 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:26:04.527 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:04.527 12:02:31 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1809225 2 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1809225 2 idle 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1809225 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1809225 -w 256 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1809229 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 
00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1809229 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:26:04.786 12:02:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:26:05.046 [2024-05-14 12:02:32.027011] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:26:05.046 [2024-05-14 12:02:32.027191] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:26:05.046 [2024-05-14 12:02:32.027331] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:05.046 12:02:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:26:05.305 [2024-05-14 12:02:32.203375] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:26:05.305 [2024-05-14 12:02:32.203531] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1809225 0 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1809225 0 busy 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1809225 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1809225 -w 256 00:26:05.305 12:02:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1809225 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.75 reactor_0' 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1809225 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.75 reactor_0 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:05.569 12:02:32 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1809225 2 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1809225 2 busy 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1809225 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1809225 -w 256 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1809229 root 20 0 128.2g 36864 23616 R 93.8 0.0 0:00.36 reactor_2' 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1809229 root 20 0 128.2g 36864 23616 R 93.8 0.0 0:00.36 reactor_2 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:05.569 12:02:32 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:05.569 12:02:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:26:05.826 [2024-05-14 12:02:32.813106] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:26:05.826 [2024-05-14 12:02:32.813202] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1809225 2 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1809225 2 idle 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1809225 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1809225 -w 256 00:26:05.826 12:02:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:06.083 12:02:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1809229 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1809229 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:06.084 12:02:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:26:06.342 [2024-05-14 12:02:33.246218] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:26:06.342 [2024-05-14 12:02:33.246420] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:26:06.342 [2024-05-14 12:02:33.246443] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1809225 0 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1809225 0 idle 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1809225 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1809225 -w 256 00:26:06.342 12:02:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1809225 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.60 reactor_0' 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1809225 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.60 reactor_0 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:06.601 12:02:33 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:26:06.601 12:02:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1809225 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@946 -- # '[' -z 1809225 ']' 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@950 -- # kill -0 1809225 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@951 -- # uname 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1809225 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1809225' 00:26:06.601 killing process with pid 1809225 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@965 -- # kill 1809225 00:26:06.601 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@970 -- # wait 1809225 00:26:06.859 12:02:33 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:26:06.859 12:02:33 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:06.859 00:26:06.859 real 0m9.822s 00:26:06.859 user 0m9.108s 00:26:06.859 sys 0m2.097s 00:26:06.859 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:06.859 12:02:33 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:06.859 ************************************ 00:26:06.859 END TEST reactor_set_interrupt 00:26:06.859 ************************************ 00:26:06.859 12:02:33 -- spdk/autotest.sh@190 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:06.859 12:02:33 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:26:06.859 12:02:33 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:06.859 12:02:33 -- common/autotest_common.sh@10 -- # set +x 00:26:06.859 ************************************ 00:26:06.859 START TEST reap_unregistered_poller 00:26:06.859 ************************************ 00:26:06.859 12:02:33 reap_unregistered_poller -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:07.121 * Looking for test storage... 
00:26:07.121 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:07.121 12:02:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:26:07.121 12:02:33 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:26:07.121 12:02:33 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:07.121 12:02:33 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:07.121 12:02:33 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:26:07.121 12:02:33 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:07.121 12:02:33 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:26:07.121 12:02:33 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:26:07.121 12:02:33 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:26:07.121 12:02:33 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:26:07.121 12:02:33 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:26:07.121 12:02:33 reap_unregistered_poller -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:26:07.121 12:02:33 reap_unregistered_poller -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 
00:26:07.121 12:02:33 reap_unregistered_poller -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:26:07.121 12:02:33 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:26:07.121 12:02:33 
reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:26:07.122 
12:02:33 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 
00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:26:07.122 12:02:33 reap_unregistered_poller -- 
common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:26:07.122 12:02:33 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:26:07.122 12:02:34 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:26:07.122 12:02:34 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:26:07.122 12:02:34 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:26:07.122 12:02:34 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:26:07.122 12:02:34 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:26:07.122 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@14 -- # 
VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:26:07.122 12:02:34 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:26:07.122 #define SPDK_CONFIG_H 00:26:07.122 #define SPDK_CONFIG_APPS 1 00:26:07.122 #define SPDK_CONFIG_ARCH native 00:26:07.122 #undef SPDK_CONFIG_ASAN 00:26:07.122 #undef SPDK_CONFIG_AVAHI 00:26:07.122 #undef SPDK_CONFIG_CET 00:26:07.122 #define SPDK_CONFIG_COVERAGE 1 00:26:07.122 #define SPDK_CONFIG_CROSS_PREFIX 00:26:07.122 #define SPDK_CONFIG_CRYPTO 1 00:26:07.122 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:26:07.122 #undef SPDK_CONFIG_CUSTOMOCF 00:26:07.122 #undef SPDK_CONFIG_DAOS 00:26:07.122 #define SPDK_CONFIG_DAOS_DIR 00:26:07.122 #define SPDK_CONFIG_DEBUG 1 00:26:07.122 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:26:07.122 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:07.122 #define SPDK_CONFIG_DPDK_INC_DIR 00:26:07.122 #define SPDK_CONFIG_DPDK_LIB_DIR 00:26:07.122 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:26:07.122 #undef SPDK_CONFIG_DPDK_UADK 00:26:07.122 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:07.122 #define SPDK_CONFIG_EXAMPLES 1 00:26:07.122 #undef SPDK_CONFIG_FC 00:26:07.122 #define 
SPDK_CONFIG_FC_PATH 00:26:07.122 #define SPDK_CONFIG_FIO_PLUGIN 1 00:26:07.122 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:26:07.122 #undef SPDK_CONFIG_FUSE 00:26:07.122 #undef SPDK_CONFIG_FUZZER 00:26:07.122 #define SPDK_CONFIG_FUZZER_LIB 00:26:07.122 #undef SPDK_CONFIG_GOLANG 00:26:07.122 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:26:07.122 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:26:07.122 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:26:07.122 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:26:07.122 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:26:07.122 #undef SPDK_CONFIG_HAVE_LIBBSD 00:26:07.122 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:26:07.122 #define SPDK_CONFIG_IDXD 1 00:26:07.122 #undef SPDK_CONFIG_IDXD_KERNEL 00:26:07.122 #define SPDK_CONFIG_IPSEC_MB 1 00:26:07.122 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:07.122 #define SPDK_CONFIG_ISAL 1 00:26:07.122 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:26:07.122 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:26:07.122 #define SPDK_CONFIG_LIBDIR 00:26:07.122 #undef SPDK_CONFIG_LTO 00:26:07.122 #define SPDK_CONFIG_MAX_LCORES 00:26:07.122 #define SPDK_CONFIG_NVME_CUSE 1 00:26:07.122 #undef SPDK_CONFIG_OCF 00:26:07.122 #define SPDK_CONFIG_OCF_PATH 00:26:07.122 #define SPDK_CONFIG_OPENSSL_PATH 00:26:07.122 #undef SPDK_CONFIG_PGO_CAPTURE 00:26:07.122 #define SPDK_CONFIG_PGO_DIR 00:26:07.122 #undef SPDK_CONFIG_PGO_USE 00:26:07.122 #define SPDK_CONFIG_PREFIX /usr/local 00:26:07.122 #undef SPDK_CONFIG_RAID5F 00:26:07.122 #undef SPDK_CONFIG_RBD 00:26:07.122 #define SPDK_CONFIG_RDMA 1 00:26:07.122 #define SPDK_CONFIG_RDMA_PROV verbs 00:26:07.122 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:26:07.122 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:26:07.122 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:26:07.122 #define SPDK_CONFIG_SHARED 1 00:26:07.122 #undef SPDK_CONFIG_SMA 00:26:07.122 #define SPDK_CONFIG_TESTS 1 00:26:07.122 #undef SPDK_CONFIG_TSAN 00:26:07.122 #define SPDK_CONFIG_UBLK 1 
00:26:07.122 #define SPDK_CONFIG_UBSAN 1 00:26:07.122 #undef SPDK_CONFIG_UNIT_TESTS 00:26:07.122 #undef SPDK_CONFIG_URING 00:26:07.122 #define SPDK_CONFIG_URING_PATH 00:26:07.122 #undef SPDK_CONFIG_URING_ZNS 00:26:07.122 #undef SPDK_CONFIG_USDT 00:26:07.122 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:26:07.123 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:26:07.123 #undef SPDK_CONFIG_VFIO_USER 00:26:07.123 #define SPDK_CONFIG_VFIO_USER_DIR 00:26:07.123 #define SPDK_CONFIG_VHOST 1 00:26:07.123 #define SPDK_CONFIG_VIRTIO 1 00:26:07.123 #undef SPDK_CONFIG_VTUNE 00:26:07.123 #define SPDK_CONFIG_VTUNE_DIR 00:26:07.123 #define SPDK_CONFIG_WERROR 1 00:26:07.123 #define SPDK_CONFIG_WPDK_DIR 00:26:07.123 #undef SPDK_CONFIG_XNVME 00:26:07.123 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:26:07.123 12:02:34 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:07.123 12:02:34 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:07.123 12:02:34 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:07.123 12:02:34 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:07.123 12:02:34 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:07.123 12:02:34 reap_unregistered_poller -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:07.123 12:02:34 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:07.123 12:02:34 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:26:07.123 12:02:34 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@6 -- # 
_pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:26:07.123 
12:02:34 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:26:07.123 12:02:34 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@57 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@61 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@63 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@65 -- # : 1 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@67 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@69 -- # : 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@71 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:26:07.123 12:02:34 reap_unregistered_poller -- 
common/autotest_common.sh@73 -- # : 1 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@75 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@77 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@79 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@81 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@83 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@85 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@87 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@89 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@91 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:26:07.123 12:02:34 reap_unregistered_poller -- 
common/autotest_common.sh@93 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@95 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@97 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@99 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@101 -- # : rdma 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@103 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@105 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@107 -- # : 1 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@109 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@111 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:26:07.123 12:02:34 reap_unregistered_poller -- 
common/autotest_common.sh@113 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@115 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@117 -- # : 1 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@119 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@121 -- # : 1 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@123 -- # : 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@125 -- # : 0 00:26:07.123 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@127 -- # : 1 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@129 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@131 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:26:07.124 12:02:34 reap_unregistered_poller -- 
common/autotest_common.sh@133 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@135 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@137 -- # : 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@139 -- # : true 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@141 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@143 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@145 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@147 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@149 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@151 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:26:07.124 12:02:34 reap_unregistered_poller -- 
common/autotest_common.sh@153 -- # : 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@155 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@157 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@159 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@161 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@163 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@166 -- # : 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@168 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@170 -- # : 0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@174 -- # 
SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@177 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@192 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@199 -- # cat 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@235 -- # echo leak:libfuse3.so 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@237 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@237 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@239 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@239 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@241 -- # '[' -z /var/spdk/dependencies ']' 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@244 -- # export DEPENDENCY_DIR 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@248 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@248 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@252 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@252 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@253 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@255 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@255 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@258 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@258 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@261 -- # '[' 0 -eq 0 ']' 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@262 -- # export valgrind= 00:26:07.124 12:02:34 
reap_unregistered_poller -- common/autotest_common.sh@262 -- # valgrind= 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@268 -- # uname -s 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@268 -- # '[' Linux = Linux ']' 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@269 -- # HUGEMEM=4096 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@270 -- # export CLEAR_HUGE=yes 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@270 -- # CLEAR_HUGE=yes 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@271 -- # [[ 1 -eq 1 ]] 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@275 -- # export HUGE_EVEN_ALLOC=yes 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@275 -- # HUGE_EVEN_ALLOC=yes 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@278 -- # MAKE=make 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKEFLAGS=-j72 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@295 -- # export HUGEMEM=4096 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@295 -- # HUGEMEM=4096 00:26:07.124 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@297 -- # NO_HUGE=() 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@298 -- # TEST_MODE= 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@317 -- # [[ -z 1809865 ]] 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@317 -- # kill -0 1809865 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@1676 -- # set_test_storage 2147483648 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@327 -- # [[ -v testdir ]] 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@329 -- # local 
requested_size=2147483648 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local mount target_dir 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@332 -- # local -A mounts fss sizes avails uses 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local source fs size avail mount use 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@335 -- # local storage_fallback storage_candidates 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@337 -- # mktemp -udt spdk.XXXXXX 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@337 -- # storage_fallback=/tmp/spdk.mtCWxs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@342 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@344 -- # [[ -n '' ]] 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@349 -- # [[ -n '' ]] 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@354 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.mtCWxs/tests/interrupt /tmp/spdk.mtCWxs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@357 -- # requested_size=2214592512 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@326 -- # df -T 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@326 -- # grep -v Filesystem 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_devtmpfs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=devtmpfs 00:26:07.125 12:02:34 
reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=67108864 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=67108864 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=0 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=/dev/pmem0 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=ext2 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=969789440 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=5284429824 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=4314640384 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_root 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=overlay 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=88913928192 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=94508531712 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=5594603520 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # 
fss["$mount"]=tmpfs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=47249555456 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254265856 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=4710400 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=18892279808 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=18901708800 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=9428992 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=47253630976 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254265856 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=634880 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:26:07.125 12:02:34 reap_unregistered_poller -- 
common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=9450848256 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=9450852352 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=4096 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@365 -- # printf '* Looking for test storage...\n' 00:26:07.125 * Looking for test storage... 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@367 -- # local target_space new_size 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@368 -- # for target_dir in "${storage_candidates[@]}" 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@371 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@371 -- # awk '$1 !~ /Filesystem/{print $6}' 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@371 -- # mount=/ 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@373 -- # target_space=88913928192 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@374 -- # (( target_space == 0 || target_space < requested_size )) 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@377 -- # (( target_space >= requested_size )) 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@379 -- # [[ overlay == tmpfs ]] 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@379 -- # [[ overlay == ramfs ]] 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@379 -- # [[ / == / ]] 00:26:07.125 12:02:34 
reap_unregistered_poller -- common/autotest_common.sh@380 -- # new_size=7809196032 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@381 -- # (( new_size * 100 / sizes[/] > 95 )) 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@386 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@386 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@387 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:07.125 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@388 -- # return 0 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@1678 -- # set -o errtrace 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@1679 -- # shopt -s extdebug 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # true 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@1685 -- # xtrace_fd 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 
00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:26:07.125 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:07.125 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1809906 00:26:07.126 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:07.126 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:07.126 12:02:34 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1809906 /var/tmp/spdk.sock 00:26:07.126 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@827 -- # '[' -z 1809906 ']' 00:26:07.126 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:07.126 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:07.126 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:07.126 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:26:07.126 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:07.126 12:02:34 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:07.126 [2024-05-14 12:02:34.191997] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:26:07.126 [2024-05-14 12:02:34.192063] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1809906 ] 00:26:07.385 [2024-05-14 12:02:34.322440] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:07.385 [2024-05-14 12:02:34.421896] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:07.385 [2024-05-14 12:02:34.421914] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:07.385 [2024-05-14 12:02:34.421918] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:07.645 [2024-05-14 12:02:34.493478] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:26:08.216 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:08.216 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@860 -- # return 0 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:26:08.216 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:08.216 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:08.216 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:26:08.216 "name": "app_thread", 00:26:08.216 "id": 1, 00:26:08.216 "active_pollers": [], 00:26:08.216 "timed_pollers": [ 00:26:08.216 { 00:26:08.216 "name": "rpc_subsystem_poll_servers", 00:26:08.216 "id": 1, 00:26:08.216 "state": "waiting", 00:26:08.216 "run_count": 0, 00:26:08.216 "busy_count": 0, 00:26:08.216 "period_ticks": 9200000 00:26:08.216 } 00:26:08.216 ], 00:26:08.216 "paused_pollers": [] 00:26:08.216 }' 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:26:08.216 
12:02:35 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:08.216 5000+0 records in 00:26:08.216 5000+0 records out 00:26:08.216 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0255682 s, 400 MB/s 00:26:08.216 12:02:35 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:08.475 AIO0 00:26:08.475 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:08.475 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:26:08.734 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:08.734 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:08.734 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:26:08.734 "name": "app_thread", 00:26:08.734 "id": 1, 00:26:08.734 "active_pollers": [], 00:26:08.734 "timed_pollers": [ 00:26:08.734 { 00:26:08.734 "name": "rpc_subsystem_poll_servers", 00:26:08.734 "id": 1, 00:26:08.734 "state": "waiting", 00:26:08.734 "run_count": 0, 00:26:08.734 "busy_count": 0, 
00:26:08.734 "period_ticks": 9200000 00:26:08.734 } 00:26:08.734 ], 00:26:08.734 "paused_pollers": [] 00:26:08.734 }' 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:26:08.734 12:02:35 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1809906 00:26:08.734 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@946 -- # '[' -z 1809906 ']' 00:26:08.734 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@950 -- # kill -0 1809906 00:26:08.734 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@951 -- # uname 00:26:08.735 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:08.735 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1809906 00:26:08.994 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:08.994 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:08.994 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@964 -- # 
echo 'killing process with pid 1809906' 00:26:08.994 killing process with pid 1809906 00:26:08.994 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@965 -- # kill 1809906 00:26:08.994 12:02:35 reap_unregistered_poller -- common/autotest_common.sh@970 -- # wait 1809906 00:26:09.253 12:02:36 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:26:09.253 12:02:36 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:09.253 00:26:09.253 real 0m2.239s 00:26:09.253 user 0m1.343s 00:26:09.253 sys 0m0.658s 00:26:09.253 12:02:36 reap_unregistered_poller -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:09.253 12:02:36 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:26:09.253 ************************************ 00:26:09.253 END TEST reap_unregistered_poller 00:26:09.253 ************************************ 00:26:09.253 12:02:36 -- spdk/autotest.sh@194 -- # uname -s 00:26:09.253 12:02:36 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:26:09.253 12:02:36 -- spdk/autotest.sh@195 -- # [[ 1 -eq 1 ]] 00:26:09.253 12:02:36 -- spdk/autotest.sh@201 -- # [[ 1 -eq 0 ]] 00:26:09.253 12:02:36 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:26:09.253 12:02:36 -- spdk/autotest.sh@252 -- # '[' 0 -eq 1 ']' 00:26:09.253 12:02:36 -- spdk/autotest.sh@256 -- # timing_exit lib 00:26:09.254 12:02:36 -- common/autotest_common.sh@726 -- # xtrace_disable 00:26:09.254 12:02:36 -- common/autotest_common.sh@10 -- # set +x 00:26:09.254 12:02:36 -- spdk/autotest.sh@258 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@266 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@275 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@304 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 
-- spdk/autotest.sh@317 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@326 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@331 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@343 -- # '[' 1 -eq 1 ']' 00:26:09.254 12:02:36 -- spdk/autotest.sh@344 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:26:09.254 12:02:36 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:26:09.254 12:02:36 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:09.254 12:02:36 -- common/autotest_common.sh@10 -- # set +x 00:26:09.254 ************************************ 00:26:09.254 START TEST compress_compdev 00:26:09.254 ************************************ 00:26:09.254 12:02:36 compress_compdev -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:26:09.513 * Looking for test storage... 
00:26:09.513 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:26:09.513 12:02:36 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:09.513 12:02:36 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:09.513 12:02:36 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:09.513 12:02:36 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:09.513 12:02:36 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:09.513 12:02:36 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:09.513 12:02:36 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:09.513 12:02:36 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:26:09.513 12:02:36 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:26:09.513 12:02:36 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:26:09.513 12:02:36 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:09.513 12:02:36 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:26:09.514 12:02:36 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:26:09.514 12:02:36 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:26:09.514 12:02:36 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:09.514 12:02:36 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1810341 00:26:09.514 12:02:36 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:09.514 12:02:36 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1810341 00:26:09.514 12:02:36 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:09.514 12:02:36 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 1810341 ']' 00:26:09.514 12:02:36 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:09.514 12:02:36 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:09.514 12:02:36 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:09.514 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:09.514 12:02:36 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:09.514 12:02:36 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:09.514 [2024-05-14 12:02:36.456997] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:26:09.514 [2024-05-14 12:02:36.457069] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1810341 ] 00:26:09.514 [2024-05-14 12:02:36.579619] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:09.773 [2024-05-14 12:02:36.684064] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:09.773 [2024-05-14 12:02:36.684070] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:10.711 [2024-05-14 12:02:37.434552] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:10.711 12:02:37 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:10.711 12:02:37 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:26:10.711 12:02:37 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:26:10.711 12:02:37 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:10.711 12:02:37 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:11.279 [2024-05-14 12:02:38.080538] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17ed170 PMD being used: compress_qat 00:26:11.279 12:02:38 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:11.279 12:02:38 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:26:11.279 12:02:38 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:11.279 12:02:38 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:11.279 12:02:38 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:11.279 12:02:38 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:11.279 12:02:38 
compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:11.279 12:02:38 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:11.538 [ 00:26:11.538 { 00:26:11.538 "name": "Nvme0n1", 00:26:11.538 "aliases": [ 00:26:11.538 "01000000-0000-0000-5cd2-e43197705251" 00:26:11.538 ], 00:26:11.538 "product_name": "NVMe disk", 00:26:11.538 "block_size": 512, 00:26:11.538 "num_blocks": 15002931888, 00:26:11.538 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:26:11.538 "assigned_rate_limits": { 00:26:11.538 "rw_ios_per_sec": 0, 00:26:11.538 "rw_mbytes_per_sec": 0, 00:26:11.538 "r_mbytes_per_sec": 0, 00:26:11.538 "w_mbytes_per_sec": 0 00:26:11.538 }, 00:26:11.538 "claimed": false, 00:26:11.538 "zoned": false, 00:26:11.538 "supported_io_types": { 00:26:11.539 "read": true, 00:26:11.539 "write": true, 00:26:11.539 "unmap": true, 00:26:11.539 "write_zeroes": true, 00:26:11.539 "flush": true, 00:26:11.539 "reset": true, 00:26:11.539 "compare": false, 00:26:11.539 "compare_and_write": false, 00:26:11.539 "abort": true, 00:26:11.539 "nvme_admin": true, 00:26:11.539 "nvme_io": true 00:26:11.539 }, 00:26:11.539 "driver_specific": { 00:26:11.539 "nvme": [ 00:26:11.539 { 00:26:11.539 "pci_address": "0000:5e:00.0", 00:26:11.539 "trid": { 00:26:11.539 "trtype": "PCIe", 00:26:11.539 "traddr": "0000:5e:00.0" 00:26:11.539 }, 00:26:11.539 "ctrlr_data": { 00:26:11.539 "cntlid": 0, 00:26:11.539 "vendor_id": "0x8086", 00:26:11.539 "model_number": "INTEL SSDPF2KX076TZO", 00:26:11.539 "serial_number": "PHAC0301002G7P6CGN", 00:26:11.539 "firmware_revision": "JCV10200", 00:26:11.539 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:26:11.539 "oacs": { 00:26:11.539 "security": 1, 00:26:11.539 "format": 1, 00:26:11.539 "firmware": 1, 00:26:11.539 "ns_manage": 1 00:26:11.539 }, 00:26:11.539 
"multi_ctrlr": false, 00:26:11.539 "ana_reporting": false 00:26:11.539 }, 00:26:11.539 "vs": { 00:26:11.539 "nvme_version": "1.3" 00:26:11.539 }, 00:26:11.539 "ns_data": { 00:26:11.539 "id": 1, 00:26:11.539 "can_share": false 00:26:11.539 }, 00:26:11.539 "security": { 00:26:11.539 "opal": true 00:26:11.539 } 00:26:11.539 } 00:26:11.539 ], 00:26:11.539 "mp_policy": "active_passive" 00:26:11.539 } 00:26:11.539 } 00:26:11.539 ] 00:26:11.539 12:02:38 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:11.539 12:02:38 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:11.798 [2024-05-14 12:02:38.725818] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1624fc0 PMD being used: compress_qat 00:26:14.331 e9b9567d-3847-4105-96a5-4396380d6e48 00:26:14.331 12:02:40 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:14.331 d2746fcc-4578-4097-b115-82bc7b392fdb 00:26:14.331 12:02:41 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:14.331 12:02:41 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:26:14.331 12:02:41 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:14.331 12:02:41 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:14.331 12:02:41 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:14.331 12:02:41 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:14.331 12:02:41 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:14.589 12:02:41 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b 
lvs0/lv0 -t 2000 00:26:14.589 [ 00:26:14.589 { 00:26:14.589 "name": "d2746fcc-4578-4097-b115-82bc7b392fdb", 00:26:14.589 "aliases": [ 00:26:14.589 "lvs0/lv0" 00:26:14.589 ], 00:26:14.589 "product_name": "Logical Volume", 00:26:14.589 "block_size": 512, 00:26:14.589 "num_blocks": 204800, 00:26:14.589 "uuid": "d2746fcc-4578-4097-b115-82bc7b392fdb", 00:26:14.589 "assigned_rate_limits": { 00:26:14.589 "rw_ios_per_sec": 0, 00:26:14.589 "rw_mbytes_per_sec": 0, 00:26:14.589 "r_mbytes_per_sec": 0, 00:26:14.589 "w_mbytes_per_sec": 0 00:26:14.589 }, 00:26:14.589 "claimed": false, 00:26:14.589 "zoned": false, 00:26:14.589 "supported_io_types": { 00:26:14.589 "read": true, 00:26:14.589 "write": true, 00:26:14.589 "unmap": true, 00:26:14.589 "write_zeroes": true, 00:26:14.589 "flush": false, 00:26:14.589 "reset": true, 00:26:14.590 "compare": false, 00:26:14.590 "compare_and_write": false, 00:26:14.590 "abort": false, 00:26:14.590 "nvme_admin": false, 00:26:14.590 "nvme_io": false 00:26:14.590 }, 00:26:14.590 "driver_specific": { 00:26:14.590 "lvol": { 00:26:14.590 "lvol_store_uuid": "e9b9567d-3847-4105-96a5-4396380d6e48", 00:26:14.590 "base_bdev": "Nvme0n1", 00:26:14.590 "thin_provision": true, 00:26:14.590 "num_allocated_clusters": 0, 00:26:14.590 "snapshot": false, 00:26:14.590 "clone": false, 00:26:14.590 "esnap_clone": false 00:26:14.590 } 00:26:14.590 } 00:26:14.590 } 00:26:14.590 ] 00:26:14.849 12:02:41 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:14.849 12:02:41 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:26:14.849 12:02:41 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:14.849 [2024-05-14 12:02:41.900573] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:14.849 COMP_lvs0/lv0 00:26:14.849 12:02:41 compress_compdev -- compress/compress.sh@46 -- 
# waitforbdev COMP_lvs0/lv0 00:26:14.849 12:02:41 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:26:14.849 12:02:41 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:14.849 12:02:41 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:14.849 12:02:41 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:14.849 12:02:41 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:14.849 12:02:41 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:15.108 12:02:42 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:15.367 [ 00:26:15.367 { 00:26:15.367 "name": "COMP_lvs0/lv0", 00:26:15.367 "aliases": [ 00:26:15.367 "3c3ab0eb-d794-59d5-a968-2944d4b32b11" 00:26:15.367 ], 00:26:15.367 "product_name": "compress", 00:26:15.367 "block_size": 512, 00:26:15.367 "num_blocks": 200704, 00:26:15.367 "uuid": "3c3ab0eb-d794-59d5-a968-2944d4b32b11", 00:26:15.367 "assigned_rate_limits": { 00:26:15.367 "rw_ios_per_sec": 0, 00:26:15.367 "rw_mbytes_per_sec": 0, 00:26:15.367 "r_mbytes_per_sec": 0, 00:26:15.367 "w_mbytes_per_sec": 0 00:26:15.367 }, 00:26:15.367 "claimed": false, 00:26:15.367 "zoned": false, 00:26:15.367 "supported_io_types": { 00:26:15.367 "read": true, 00:26:15.367 "write": true, 00:26:15.367 "unmap": false, 00:26:15.367 "write_zeroes": true, 00:26:15.367 "flush": false, 00:26:15.367 "reset": false, 00:26:15.367 "compare": false, 00:26:15.367 "compare_and_write": false, 00:26:15.367 "abort": false, 00:26:15.367 "nvme_admin": false, 00:26:15.367 "nvme_io": false 00:26:15.367 }, 00:26:15.367 "driver_specific": { 00:26:15.367 "compress": { 00:26:15.367 "name": "COMP_lvs0/lv0", 00:26:15.367 "base_bdev_name": "d2746fcc-4578-4097-b115-82bc7b392fdb" 
00:26:15.367 } 00:26:15.367 } 00:26:15.367 } 00:26:15.367 ] 00:26:15.367 12:02:42 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:15.367 12:02:42 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:15.648 [2024-05-14 12:02:42.530736] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2c9c1b15a0 PMD being used: compress_qat 00:26:15.648 [2024-05-14 12:02:42.532943] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16ce170 PMD being used: compress_qat 00:26:15.648 Running I/O for 3 seconds... 00:26:18.939 00:26:18.939 Latency(us) 00:26:18.939 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:18.939 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:18.939 Verification LBA range: start 0x0 length 0x3100 00:26:18.939 COMP_lvs0/lv0 : 3.00 5030.43 19.65 0.00 0.00 6307.31 573.44 6211.67 00:26:18.939 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:18.939 Verification LBA range: start 0x3100 length 0x3100 00:26:18.939 COMP_lvs0/lv0 : 3.00 5308.60 20.74 0.00 0.00 5990.44 373.98 5556.31 00:26:18.939 =================================================================================================================== 00:26:18.939 Total : 10339.03 40.39 0.00 0.00 6144.63 373.98 6211.67 00:26:18.939 0 00:26:18.939 12:02:45 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:18.939 12:02:45 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:18.939 12:02:45 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:18.939 12:02:46 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:18.939 12:02:46 
compress_compdev -- compress/compress.sh@78 -- # killprocess 1810341 00:26:18.939 12:02:46 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 1810341 ']' 00:26:18.939 12:02:46 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 1810341 00:26:18.939 12:02:46 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:26:18.939 12:02:46 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:18.939 12:02:46 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1810341 00:26:19.199 12:02:46 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:26:19.199 12:02:46 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:26:19.199 12:02:46 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1810341' 00:26:19.199 killing process with pid 1810341 00:26:19.199 12:02:46 compress_compdev -- common/autotest_common.sh@965 -- # kill 1810341 00:26:19.199 Received shutdown signal, test time was about 3.000000 seconds 00:26:19.199 00:26:19.199 Latency(us) 00:26:19.199 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:19.199 =================================================================================================================== 00:26:19.199 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:19.199 12:02:46 compress_compdev -- common/autotest_common.sh@970 -- # wait 1810341 00:26:22.491 12:02:49 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:26:22.491 12:02:49 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:22.491 12:02:49 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1811951 00:26:22.491 12:02:49 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:22.491 12:02:49 compress_compdev -- compress/compress.sh@67 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:22.491 12:02:49 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1811951 00:26:22.491 12:02:49 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 1811951 ']' 00:26:22.491 12:02:49 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:22.491 12:02:49 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:22.491 12:02:49 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:22.491 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:22.491 12:02:49 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:22.491 12:02:49 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:22.491 [2024-05-14 12:02:49.140676] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:26:22.491 [2024-05-14 12:02:49.140751] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1811951 ] 00:26:22.491 [2024-05-14 12:02:49.262532] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:22.491 [2024-05-14 12:02:49.365465] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:22.491 [2024-05-14 12:02:49.365472] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:23.059 [2024-05-14 12:02:50.122141] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:23.318 12:02:50 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:23.318 12:02:50 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:26:23.318 12:02:50 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:26:23.318 12:02:50 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:23.318 12:02:50 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:23.886 [2024-05-14 12:02:50.760748] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x271c170 PMD being used: compress_qat 00:26:23.886 12:02:50 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:23.886 12:02:50 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:26:23.886 12:02:50 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:23.886 12:02:50 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:23.886 12:02:50 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:23.886 12:02:50 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:23.886 12:02:50 
compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:24.144 12:02:50 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:24.144 [ 00:26:24.144 { 00:26:24.144 "name": "Nvme0n1", 00:26:24.144 "aliases": [ 00:26:24.144 "01000000-0000-0000-5cd2-e43197705251" 00:26:24.144 ], 00:26:24.144 "product_name": "NVMe disk", 00:26:24.144 "block_size": 512, 00:26:24.144 "num_blocks": 15002931888, 00:26:24.144 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:26:24.144 "assigned_rate_limits": { 00:26:24.144 "rw_ios_per_sec": 0, 00:26:24.144 "rw_mbytes_per_sec": 0, 00:26:24.144 "r_mbytes_per_sec": 0, 00:26:24.144 "w_mbytes_per_sec": 0 00:26:24.144 }, 00:26:24.144 "claimed": false, 00:26:24.144 "zoned": false, 00:26:24.144 "supported_io_types": { 00:26:24.144 "read": true, 00:26:24.144 "write": true, 00:26:24.144 "unmap": true, 00:26:24.144 "write_zeroes": true, 00:26:24.144 "flush": true, 00:26:24.144 "reset": true, 00:26:24.144 "compare": false, 00:26:24.144 "compare_and_write": false, 00:26:24.144 "abort": true, 00:26:24.144 "nvme_admin": true, 00:26:24.144 "nvme_io": true 00:26:24.144 }, 00:26:24.144 "driver_specific": { 00:26:24.144 "nvme": [ 00:26:24.144 { 00:26:24.144 "pci_address": "0000:5e:00.0", 00:26:24.144 "trid": { 00:26:24.144 "trtype": "PCIe", 00:26:24.144 "traddr": "0000:5e:00.0" 00:26:24.144 }, 00:26:24.144 "ctrlr_data": { 00:26:24.144 "cntlid": 0, 00:26:24.144 "vendor_id": "0x8086", 00:26:24.144 "model_number": "INTEL SSDPF2KX076TZO", 00:26:24.144 "serial_number": "PHAC0301002G7P6CGN", 00:26:24.144 "firmware_revision": "JCV10200", 00:26:24.144 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:26:24.144 "oacs": { 00:26:24.144 "security": 1, 00:26:24.144 "format": 1, 00:26:24.144 "firmware": 1, 00:26:24.144 "ns_manage": 1 00:26:24.144 }, 00:26:24.144 
"multi_ctrlr": false, 00:26:24.144 "ana_reporting": false 00:26:24.144 }, 00:26:24.144 "vs": { 00:26:24.144 "nvme_version": "1.3" 00:26:24.144 }, 00:26:24.144 "ns_data": { 00:26:24.144 "id": 1, 00:26:24.144 "can_share": false 00:26:24.144 }, 00:26:24.144 "security": { 00:26:24.144 "opal": true 00:26:24.144 } 00:26:24.144 } 00:26:24.144 ], 00:26:24.144 "mp_policy": "active_passive" 00:26:24.144 } 00:26:24.144 } 00:26:24.144 ] 00:26:24.401 12:02:51 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:24.401 12:02:51 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:24.401 [2024-05-14 12:02:51.474234] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x271cab0 PMD being used: compress_qat 00:26:26.937 6d3e11e8-ab0a-42f9-bc5f-f891a0ef11e3 00:26:26.937 12:02:53 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:26.937 31dc8bdd-8558-45aa-a100-deedac261dce 00:26:26.937 12:02:53 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:26.937 12:02:53 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:26:26.937 12:02:53 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:26.937 12:02:53 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:26.937 12:02:53 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:26.937 12:02:53 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:26.937 12:02:53 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:27.196 12:02:54 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b 
lvs0/lv0 -t 2000 00:26:27.455 [ 00:26:27.455 { 00:26:27.455 "name": "31dc8bdd-8558-45aa-a100-deedac261dce", 00:26:27.455 "aliases": [ 00:26:27.455 "lvs0/lv0" 00:26:27.455 ], 00:26:27.455 "product_name": "Logical Volume", 00:26:27.455 "block_size": 512, 00:26:27.455 "num_blocks": 204800, 00:26:27.455 "uuid": "31dc8bdd-8558-45aa-a100-deedac261dce", 00:26:27.455 "assigned_rate_limits": { 00:26:27.455 "rw_ios_per_sec": 0, 00:26:27.455 "rw_mbytes_per_sec": 0, 00:26:27.455 "r_mbytes_per_sec": 0, 00:26:27.455 "w_mbytes_per_sec": 0 00:26:27.455 }, 00:26:27.455 "claimed": false, 00:26:27.455 "zoned": false, 00:26:27.455 "supported_io_types": { 00:26:27.455 "read": true, 00:26:27.455 "write": true, 00:26:27.455 "unmap": true, 00:26:27.455 "write_zeroes": true, 00:26:27.455 "flush": false, 00:26:27.455 "reset": true, 00:26:27.455 "compare": false, 00:26:27.455 "compare_and_write": false, 00:26:27.455 "abort": false, 00:26:27.455 "nvme_admin": false, 00:26:27.455 "nvme_io": false 00:26:27.455 }, 00:26:27.455 "driver_specific": { 00:26:27.455 "lvol": { 00:26:27.455 "lvol_store_uuid": "6d3e11e8-ab0a-42f9-bc5f-f891a0ef11e3", 00:26:27.455 "base_bdev": "Nvme0n1", 00:26:27.455 "thin_provision": true, 00:26:27.455 "num_allocated_clusters": 0, 00:26:27.455 "snapshot": false, 00:26:27.455 "clone": false, 00:26:27.455 "esnap_clone": false 00:26:27.455 } 00:26:27.455 } 00:26:27.455 } 00:26:27.455 ] 00:26:27.455 12:02:54 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:27.455 12:02:54 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:26:27.455 12:02:54 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:26:27.715 [2024-05-14 12:02:54.552524] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:27.715 COMP_lvs0/lv0 00:26:27.715 12:02:54 compress_compdev -- 
compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:27.715 12:02:54 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:26:27.715 12:02:54 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:27.715 12:02:54 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:27.715 12:02:54 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:27.715 12:02:54 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:27.715 12:02:54 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:27.974 12:02:54 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:27.974 [ 00:26:27.974 { 00:26:27.974 "name": "COMP_lvs0/lv0", 00:26:27.974 "aliases": [ 00:26:27.974 "e3d94047-8cfe-576f-a7bf-3d703417a02e" 00:26:27.974 ], 00:26:27.974 "product_name": "compress", 00:26:27.974 "block_size": 512, 00:26:27.974 "num_blocks": 200704, 00:26:27.974 "uuid": "e3d94047-8cfe-576f-a7bf-3d703417a02e", 00:26:27.974 "assigned_rate_limits": { 00:26:27.974 "rw_ios_per_sec": 0, 00:26:27.974 "rw_mbytes_per_sec": 0, 00:26:27.974 "r_mbytes_per_sec": 0, 00:26:27.974 "w_mbytes_per_sec": 0 00:26:27.974 }, 00:26:27.974 "claimed": false, 00:26:27.974 "zoned": false, 00:26:27.974 "supported_io_types": { 00:26:27.974 "read": true, 00:26:27.974 "write": true, 00:26:27.974 "unmap": false, 00:26:27.974 "write_zeroes": true, 00:26:27.974 "flush": false, 00:26:27.974 "reset": false, 00:26:27.974 "compare": false, 00:26:27.974 "compare_and_write": false, 00:26:27.974 "abort": false, 00:26:27.974 "nvme_admin": false, 00:26:27.974 "nvme_io": false 00:26:27.974 }, 00:26:27.974 "driver_specific": { 00:26:27.974 "compress": { 00:26:27.974 "name": "COMP_lvs0/lv0", 00:26:27.974 "base_bdev_name": 
"31dc8bdd-8558-45aa-a100-deedac261dce" 00:26:27.974 } 00:26:27.974 } 00:26:27.974 } 00:26:27.974 ] 00:26:28.233 12:02:55 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:28.233 12:02:55 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:28.233 [2024-05-14 12:02:55.195028] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f25d01b15a0 PMD being used: compress_qat 00:26:28.233 [2024-05-14 12:02:55.197278] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25fd130 PMD being used: compress_qat 00:26:28.233 Running I/O for 3 seconds... 00:26:31.524 00:26:31.524 Latency(us) 00:26:31.524 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:31.524 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:31.524 Verification LBA range: start 0x0 length 0x3100 00:26:31.524 COMP_lvs0/lv0 : 3.00 5074.53 19.82 0.00 0.00 6253.21 559.19 5841.25 00:26:31.524 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:31.524 Verification LBA range: start 0x3100 length 0x3100 00:26:31.524 COMP_lvs0/lv0 : 3.00 5347.73 20.89 0.00 0.00 5947.05 286.72 5841.25 00:26:31.524 =================================================================================================================== 00:26:31.524 Total : 10422.26 40.71 0.00 0.00 6096.14 286.72 5841.25 00:26:31.524 0 00:26:31.524 12:02:58 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:31.524 12:02:58 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:31.524 12:02:58 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:31.784 12:02:58 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM 
EXIT 00:26:31.784 12:02:58 compress_compdev -- compress/compress.sh@78 -- # killprocess 1811951 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 1811951 ']' 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 1811951 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1811951 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1811951' 00:26:31.784 killing process with pid 1811951 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@965 -- # kill 1811951 00:26:31.784 Received shutdown signal, test time was about 3.000000 seconds 00:26:31.784 00:26:31.784 Latency(us) 00:26:31.784 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:31.784 =================================================================================================================== 00:26:31.784 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:31.784 12:02:58 compress_compdev -- common/autotest_common.sh@970 -- # wait 1811951 00:26:35.083 12:03:01 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:26:35.083 12:03:01 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:35.083 12:03:01 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1813651 00:26:35.083 12:03:01 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:35.083 12:03:01 compress_compdev -- 
compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:26:35.083 12:03:01 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1813651 00:26:35.083 12:03:01 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 1813651 ']' 00:26:35.083 12:03:01 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:35.083 12:03:01 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:35.083 12:03:01 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:35.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:35.083 12:03:01 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:35.083 12:03:01 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:35.083 [2024-05-14 12:03:01.619628] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:26:35.083 [2024-05-14 12:03:01.619705] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1813651 ] 00:26:35.083 [2024-05-14 12:03:01.742220] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:35.083 [2024-05-14 12:03:01.846137] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:35.083 [2024-05-14 12:03:01.846142] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:35.651 [2024-05-14 12:03:02.591769] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:35.651 12:03:02 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:35.651 12:03:02 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:26:35.651 12:03:02 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:26:35.651 12:03:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:35.651 12:03:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:36.218 [2024-05-14 12:03:03.240512] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2019170 PMD being used: compress_qat 00:26:36.218 12:03:03 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:36.218 12:03:03 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:26:36.218 12:03:03 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:36.218 12:03:03 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:36.218 12:03:03 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:36.218 12:03:03 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:36.218 12:03:03 
compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:36.477 12:03:03 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:36.776 [ 00:26:36.776 { 00:26:36.776 "name": "Nvme0n1", 00:26:36.776 "aliases": [ 00:26:36.776 "01000000-0000-0000-5cd2-e43197705251" 00:26:36.776 ], 00:26:36.776 "product_name": "NVMe disk", 00:26:36.776 "block_size": 512, 00:26:36.776 "num_blocks": 15002931888, 00:26:36.776 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:26:36.776 "assigned_rate_limits": { 00:26:36.776 "rw_ios_per_sec": 0, 00:26:36.776 "rw_mbytes_per_sec": 0, 00:26:36.776 "r_mbytes_per_sec": 0, 00:26:36.776 "w_mbytes_per_sec": 0 00:26:36.776 }, 00:26:36.776 "claimed": false, 00:26:36.776 "zoned": false, 00:26:36.776 "supported_io_types": { 00:26:36.776 "read": true, 00:26:36.776 "write": true, 00:26:36.776 "unmap": true, 00:26:36.776 "write_zeroes": true, 00:26:36.776 "flush": true, 00:26:36.776 "reset": true, 00:26:36.776 "compare": false, 00:26:36.776 "compare_and_write": false, 00:26:36.776 "abort": true, 00:26:36.776 "nvme_admin": true, 00:26:36.776 "nvme_io": true 00:26:36.776 }, 00:26:36.776 "driver_specific": { 00:26:36.776 "nvme": [ 00:26:36.776 { 00:26:36.776 "pci_address": "0000:5e:00.0", 00:26:36.776 "trid": { 00:26:36.776 "trtype": "PCIe", 00:26:36.776 "traddr": "0000:5e:00.0" 00:26:36.776 }, 00:26:36.776 "ctrlr_data": { 00:26:36.776 "cntlid": 0, 00:26:36.776 "vendor_id": "0x8086", 00:26:36.776 "model_number": "INTEL SSDPF2KX076TZO", 00:26:36.776 "serial_number": "PHAC0301002G7P6CGN", 00:26:36.776 "firmware_revision": "JCV10200", 00:26:36.776 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:26:36.776 "oacs": { 00:26:36.776 "security": 1, 00:26:36.776 "format": 1, 00:26:36.776 "firmware": 1, 00:26:36.776 "ns_manage": 1 00:26:36.776 }, 00:26:36.776 
"multi_ctrlr": false, 00:26:36.776 "ana_reporting": false 00:26:36.776 }, 00:26:36.776 "vs": { 00:26:36.776 "nvme_version": "1.3" 00:26:36.776 }, 00:26:36.776 "ns_data": { 00:26:36.776 "id": 1, 00:26:36.776 "can_share": false 00:26:36.776 }, 00:26:36.776 "security": { 00:26:36.776 "opal": true 00:26:36.776 } 00:26:36.776 } 00:26:36.776 ], 00:26:36.776 "mp_policy": "active_passive" 00:26:36.776 } 00:26:36.776 } 00:26:36.776 ] 00:26:36.776 12:03:03 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:36.776 12:03:03 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:37.062 [2024-05-14 12:03:04.002254] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2019ab0 PMD being used: compress_qat 00:26:39.594 6cb788f1-2632-457a-945a-007d9570038c 00:26:39.594 12:03:06 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:39.594 7fd775e8-20cc-4e13-bd93-3abd78de8098 00:26:39.594 12:03:06 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:39.594 12:03:06 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:26:39.594 12:03:06 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:39.594 12:03:06 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:39.594 12:03:06 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:39.594 12:03:06 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:39.594 12:03:06 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:39.853 12:03:06 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b 
lvs0/lv0 -t 2000 00:26:40.113 [ 00:26:40.113 { 00:26:40.113 "name": "7fd775e8-20cc-4e13-bd93-3abd78de8098", 00:26:40.113 "aliases": [ 00:26:40.113 "lvs0/lv0" 00:26:40.113 ], 00:26:40.113 "product_name": "Logical Volume", 00:26:40.113 "block_size": 512, 00:26:40.113 "num_blocks": 204800, 00:26:40.113 "uuid": "7fd775e8-20cc-4e13-bd93-3abd78de8098", 00:26:40.113 "assigned_rate_limits": { 00:26:40.113 "rw_ios_per_sec": 0, 00:26:40.113 "rw_mbytes_per_sec": 0, 00:26:40.113 "r_mbytes_per_sec": 0, 00:26:40.113 "w_mbytes_per_sec": 0 00:26:40.113 }, 00:26:40.113 "claimed": false, 00:26:40.113 "zoned": false, 00:26:40.113 "supported_io_types": { 00:26:40.113 "read": true, 00:26:40.113 "write": true, 00:26:40.113 "unmap": true, 00:26:40.113 "write_zeroes": true, 00:26:40.113 "flush": false, 00:26:40.113 "reset": true, 00:26:40.113 "compare": false, 00:26:40.113 "compare_and_write": false, 00:26:40.113 "abort": false, 00:26:40.113 "nvme_admin": false, 00:26:40.113 "nvme_io": false 00:26:40.113 }, 00:26:40.113 "driver_specific": { 00:26:40.113 "lvol": { 00:26:40.113 "lvol_store_uuid": "6cb788f1-2632-457a-945a-007d9570038c", 00:26:40.113 "base_bdev": "Nvme0n1", 00:26:40.113 "thin_provision": true, 00:26:40.113 "num_allocated_clusters": 0, 00:26:40.113 "snapshot": false, 00:26:40.113 "clone": false, 00:26:40.113 "esnap_clone": false 00:26:40.113 } 00:26:40.113 } 00:26:40.113 } 00:26:40.113 ] 00:26:40.113 12:03:06 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:40.113 12:03:06 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:26:40.113 12:03:06 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:26:40.372 [2024-05-14 12:03:07.212907] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:40.372 COMP_lvs0/lv0 00:26:40.372 12:03:07 compress_compdev -- 
compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:40.372 12:03:07 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:26:40.372 12:03:07 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:40.372 12:03:07 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:40.372 12:03:07 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:40.372 12:03:07 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:40.372 12:03:07 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:40.631 12:03:07 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:40.898 [ 00:26:40.898 { 00:26:40.898 "name": "COMP_lvs0/lv0", 00:26:40.898 "aliases": [ 00:26:40.898 "1c4d57fc-e0f5-5579-a1be-dfd78c7988c5" 00:26:40.898 ], 00:26:40.898 "product_name": "compress", 00:26:40.898 "block_size": 4096, 00:26:40.898 "num_blocks": 25088, 00:26:40.898 "uuid": "1c4d57fc-e0f5-5579-a1be-dfd78c7988c5", 00:26:40.898 "assigned_rate_limits": { 00:26:40.898 "rw_ios_per_sec": 0, 00:26:40.898 "rw_mbytes_per_sec": 0, 00:26:40.898 "r_mbytes_per_sec": 0, 00:26:40.898 "w_mbytes_per_sec": 0 00:26:40.898 }, 00:26:40.898 "claimed": false, 00:26:40.898 "zoned": false, 00:26:40.898 "supported_io_types": { 00:26:40.898 "read": true, 00:26:40.898 "write": true, 00:26:40.898 "unmap": false, 00:26:40.898 "write_zeroes": true, 00:26:40.898 "flush": false, 00:26:40.898 "reset": false, 00:26:40.898 "compare": false, 00:26:40.898 "compare_and_write": false, 00:26:40.898 "abort": false, 00:26:40.898 "nvme_admin": false, 00:26:40.898 "nvme_io": false 00:26:40.898 }, 00:26:40.898 "driver_specific": { 00:26:40.898 "compress": { 00:26:40.898 "name": "COMP_lvs0/lv0", 00:26:40.898 "base_bdev_name": 
"7fd775e8-20cc-4e13-bd93-3abd78de8098" 00:26:40.898 } 00:26:40.898 } 00:26:40.898 } 00:26:40.898 ] 00:26:40.898 12:03:07 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:40.898 12:03:07 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:40.898 [2024-05-14 12:03:07.851160] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f11741b15a0 PMD being used: compress_qat 00:26:40.898 [2024-05-14 12:03:07.853444] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1efa130 PMD being used: compress_qat 00:26:40.898 Running I/O for 3 seconds... 00:26:44.194 00:26:44.194 Latency(us) 00:26:44.194 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:44.194 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:44.194 Verification LBA range: start 0x0 length 0x3100 00:26:44.194 COMP_lvs0/lv0 : 3.00 5032.46 19.66 0.00 0.00 6307.01 491.52 5755.77 00:26:44.194 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:44.194 Verification LBA range: start 0x3100 length 0x3100 00:26:44.194 COMP_lvs0/lv0 : 3.00 5272.19 20.59 0.00 0.00 6032.06 406.04 5926.73 00:26:44.194 =================================================================================================================== 00:26:44.194 Total : 10304.65 40.25 0.00 0.00 6166.34 406.04 5926.73 00:26:44.194 0 00:26:44.194 12:03:10 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:44.194 12:03:10 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:44.194 12:03:11 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:44.453 12:03:11 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM 
EXIT 00:26:44.453 12:03:11 compress_compdev -- compress/compress.sh@78 -- # killprocess 1813651 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 1813651 ']' 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 1813651 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1813651 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1813651' 00:26:44.453 killing process with pid 1813651 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@965 -- # kill 1813651 00:26:44.453 Received shutdown signal, test time was about 3.000000 seconds 00:26:44.453 00:26:44.453 Latency(us) 00:26:44.453 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:44.453 =================================================================================================================== 00:26:44.453 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:44.453 12:03:11 compress_compdev -- common/autotest_common.sh@970 -- # wait 1813651 00:26:47.755 12:03:14 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:26:47.755 12:03:14 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:47.755 12:03:14 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1815698 00:26:47.755 12:03:14 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:47.755 12:03:14 compress_compdev -- compress/compress.sh@51 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:26:47.755 12:03:14 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 1815698 00:26:47.755 12:03:14 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 1815698 ']' 00:26:47.755 12:03:14 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:47.755 12:03:14 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:47.755 12:03:14 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:47.755 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:47.755 12:03:14 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:47.755 12:03:14 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:47.755 [2024-05-14 12:03:14.263798] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:26:47.755 [2024-05-14 12:03:14.263863] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1815698 ] 00:26:47.755 [2024-05-14 12:03:14.393678] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:47.755 [2024-05-14 12:03:14.501721] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:47.755 [2024-05-14 12:03:14.501804] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:47.755 [2024-05-14 12:03:14.501810] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:48.323 [2024-05-14 12:03:15.259490] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:48.323 12:03:15 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:48.323 12:03:15 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:26:48.323 12:03:15 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:26:48.323 12:03:15 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:48.323 12:03:15 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:48.891 [2024-05-14 12:03:15.905062] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfb9bb0 PMD being used: compress_qat 00:26:48.891 12:03:15 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:48.891 12:03:15 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:26:48.891 12:03:15 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:48.891 12:03:15 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:48.891 12:03:15 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:48.891 
12:03:15 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:48.891 12:03:15 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:49.149 12:03:16 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:49.408 [ 00:26:49.409 { 00:26:49.409 "name": "Nvme0n1", 00:26:49.409 "aliases": [ 00:26:49.409 "01000000-0000-0000-5cd2-e43197705251" 00:26:49.409 ], 00:26:49.409 "product_name": "NVMe disk", 00:26:49.409 "block_size": 512, 00:26:49.409 "num_blocks": 15002931888, 00:26:49.409 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:26:49.409 "assigned_rate_limits": { 00:26:49.409 "rw_ios_per_sec": 0, 00:26:49.409 "rw_mbytes_per_sec": 0, 00:26:49.409 "r_mbytes_per_sec": 0, 00:26:49.409 "w_mbytes_per_sec": 0 00:26:49.409 }, 00:26:49.409 "claimed": false, 00:26:49.409 "zoned": false, 00:26:49.409 "supported_io_types": { 00:26:49.409 "read": true, 00:26:49.409 "write": true, 00:26:49.409 "unmap": true, 00:26:49.409 "write_zeroes": true, 00:26:49.409 "flush": true, 00:26:49.409 "reset": true, 00:26:49.409 "compare": false, 00:26:49.409 "compare_and_write": false, 00:26:49.409 "abort": true, 00:26:49.409 "nvme_admin": true, 00:26:49.409 "nvme_io": true 00:26:49.409 }, 00:26:49.409 "driver_specific": { 00:26:49.409 "nvme": [ 00:26:49.409 { 00:26:49.409 "pci_address": "0000:5e:00.0", 00:26:49.409 "trid": { 00:26:49.409 "trtype": "PCIe", 00:26:49.409 "traddr": "0000:5e:00.0" 00:26:49.409 }, 00:26:49.409 "ctrlr_data": { 00:26:49.409 "cntlid": 0, 00:26:49.409 "vendor_id": "0x8086", 00:26:49.409 "model_number": "INTEL SSDPF2KX076TZO", 00:26:49.409 "serial_number": "PHAC0301002G7P6CGN", 00:26:49.409 "firmware_revision": "JCV10200", 00:26:49.409 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:26:49.409 "oacs": { 00:26:49.409 "security": 1, 00:26:49.409 
"format": 1, 00:26:49.409 "firmware": 1, 00:26:49.409 "ns_manage": 1 00:26:49.409 }, 00:26:49.409 "multi_ctrlr": false, 00:26:49.409 "ana_reporting": false 00:26:49.409 }, 00:26:49.409 "vs": { 00:26:49.409 "nvme_version": "1.3" 00:26:49.409 }, 00:26:49.409 "ns_data": { 00:26:49.409 "id": 1, 00:26:49.409 "can_share": false 00:26:49.409 }, 00:26:49.409 "security": { 00:26:49.409 "opal": true 00:26:49.409 } 00:26:49.409 } 00:26:49.409 ], 00:26:49.409 "mp_policy": "active_passive" 00:26:49.409 } 00:26:49.409 } 00:26:49.409 ] 00:26:49.409 12:03:16 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:49.409 12:03:16 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:49.668 [2024-05-14 12:03:16.518442] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfba4f0 PMD being used: compress_qat 00:26:52.202 8e35f5a2-854a-4680-ba21-483457509379 00:26:52.202 12:03:18 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:52.202 74c9da67-3e02-4ee3-9c69-a65cfd3488a0 00:26:52.202 12:03:18 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:52.202 12:03:18 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:26:52.202 12:03:18 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:52.202 12:03:18 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:52.202 12:03:18 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:52.202 12:03:18 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:52.202 12:03:18 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:52.202 12:03:19 compress_compdev -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:52.461 [ 00:26:52.461 { 00:26:52.461 "name": "74c9da67-3e02-4ee3-9c69-a65cfd3488a0", 00:26:52.461 "aliases": [ 00:26:52.461 "lvs0/lv0" 00:26:52.461 ], 00:26:52.461 "product_name": "Logical Volume", 00:26:52.461 "block_size": 512, 00:26:52.461 "num_blocks": 204800, 00:26:52.461 "uuid": "74c9da67-3e02-4ee3-9c69-a65cfd3488a0", 00:26:52.461 "assigned_rate_limits": { 00:26:52.461 "rw_ios_per_sec": 0, 00:26:52.461 "rw_mbytes_per_sec": 0, 00:26:52.461 "r_mbytes_per_sec": 0, 00:26:52.461 "w_mbytes_per_sec": 0 00:26:52.461 }, 00:26:52.461 "claimed": false, 00:26:52.461 "zoned": false, 00:26:52.461 "supported_io_types": { 00:26:52.461 "read": true, 00:26:52.461 "write": true, 00:26:52.461 "unmap": true, 00:26:52.461 "write_zeroes": true, 00:26:52.461 "flush": false, 00:26:52.461 "reset": true, 00:26:52.461 "compare": false, 00:26:52.461 "compare_and_write": false, 00:26:52.461 "abort": false, 00:26:52.461 "nvme_admin": false, 00:26:52.461 "nvme_io": false 00:26:52.461 }, 00:26:52.461 "driver_specific": { 00:26:52.461 "lvol": { 00:26:52.461 "lvol_store_uuid": "8e35f5a2-854a-4680-ba21-483457509379", 00:26:52.461 "base_bdev": "Nvme0n1", 00:26:52.461 "thin_provision": true, 00:26:52.461 "num_allocated_clusters": 0, 00:26:52.461 "snapshot": false, 00:26:52.461 "clone": false, 00:26:52.461 "esnap_clone": false 00:26:52.461 } 00:26:52.461 } 00:26:52.461 } 00:26:52.461 ] 00:26:52.461 12:03:19 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:52.461 12:03:19 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:26:52.461 12:03:19 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:52.721 [2024-05-14 12:03:19.714001] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev 
for: COMP_lvs0/lv0 00:26:52.721 COMP_lvs0/lv0 00:26:52.721 12:03:19 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:52.721 12:03:19 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:26:52.721 12:03:19 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:52.721 12:03:19 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:26:52.721 12:03:19 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:52.721 12:03:19 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:52.721 12:03:19 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:52.980 12:03:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:53.240 [ 00:26:53.240 { 00:26:53.240 "name": "COMP_lvs0/lv0", 00:26:53.240 "aliases": [ 00:26:53.240 "8e923d72-41f0-535c-bc7a-a5f54fb9fdb2" 00:26:53.240 ], 00:26:53.240 "product_name": "compress", 00:26:53.240 "block_size": 512, 00:26:53.240 "num_blocks": 200704, 00:26:53.240 "uuid": "8e923d72-41f0-535c-bc7a-a5f54fb9fdb2", 00:26:53.240 "assigned_rate_limits": { 00:26:53.240 "rw_ios_per_sec": 0, 00:26:53.240 "rw_mbytes_per_sec": 0, 00:26:53.240 "r_mbytes_per_sec": 0, 00:26:53.240 "w_mbytes_per_sec": 0 00:26:53.240 }, 00:26:53.240 "claimed": false, 00:26:53.240 "zoned": false, 00:26:53.240 "supported_io_types": { 00:26:53.240 "read": true, 00:26:53.240 "write": true, 00:26:53.240 "unmap": false, 00:26:53.240 "write_zeroes": true, 00:26:53.240 "flush": false, 00:26:53.240 "reset": false, 00:26:53.240 "compare": false, 00:26:53.240 "compare_and_write": false, 00:26:53.240 "abort": false, 00:26:53.240 "nvme_admin": false, 00:26:53.240 "nvme_io": false 00:26:53.240 }, 00:26:53.240 "driver_specific": { 00:26:53.240 "compress": 
{ 00:26:53.240 "name": "COMP_lvs0/lv0", 00:26:53.240 "base_bdev_name": "74c9da67-3e02-4ee3-9c69-a65cfd3488a0" 00:26:53.240 } 00:26:53.240 } 00:26:53.240 } 00:26:53.240 ] 00:26:53.240 12:03:20 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:26:53.240 12:03:20 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:53.500 [2024-05-14 12:03:20.363041] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f69b01b1330 PMD being used: compress_qat 00:26:53.500 I/O targets: 00:26:53.500 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:26:53.500 00:26:53.500 00:26:53.500 CUnit - A unit testing framework for C - Version 2.1-3 00:26:53.500 http://cunit.sourceforge.net/ 00:26:53.500 00:26:53.500 00:26:53.500 Suite: bdevio tests on: COMP_lvs0/lv0 00:26:53.500 Test: blockdev write read block ...passed 00:26:53.500 Test: blockdev write zeroes read block ...passed 00:26:53.500 Test: blockdev write zeroes read no split ...passed 00:26:53.500 Test: blockdev write zeroes read split ...passed 00:26:53.500 Test: blockdev write zeroes read split partial ...passed 00:26:53.500 Test: blockdev reset ...[2024-05-14 12:03:20.401998] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:26:53.500 passed 00:26:53.500 Test: blockdev write read 8 blocks ...passed 00:26:53.500 Test: blockdev write read size > 128k ...passed 00:26:53.500 Test: blockdev write read invalid size ...passed 00:26:53.500 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:53.500 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:53.500 Test: blockdev write read max offset ...passed 00:26:53.500 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:53.500 Test: blockdev writev readv 8 blocks ...passed 00:26:53.500 Test: blockdev writev readv 30 x 1block ...passed 00:26:53.500 Test: blockdev writev 
readv block ...passed 00:26:53.500 Test: blockdev writev readv size > 128k ...passed 00:26:53.500 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:53.500 Test: blockdev comparev and writev ...passed 00:26:53.500 Test: blockdev nvme passthru rw ...passed 00:26:53.500 Test: blockdev nvme passthru vendor specific ...passed 00:26:53.500 Test: blockdev nvme admin passthru ...passed 00:26:53.500 Test: blockdev copy ...passed 00:26:53.500 00:26:53.500 Run Summary: Type Total Ran Passed Failed Inactive 00:26:53.500 suites 1 1 n/a 0 0 00:26:53.500 tests 23 23 23 0 0 00:26:53.500 asserts 130 130 130 0 n/a 00:26:53.500 00:26:53.500 Elapsed time = 0.096 seconds 00:26:53.500 0 00:26:53.500 12:03:20 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:26:53.500 12:03:20 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:53.759 12:03:20 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:54.019 12:03:20 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:26:54.019 12:03:20 compress_compdev -- compress/compress.sh@62 -- # killprocess 1815698 00:26:54.019 12:03:20 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 1815698 ']' 00:26:54.019 12:03:20 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 1815698 00:26:54.019 12:03:20 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:26:54.019 12:03:20 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:54.019 12:03:20 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1815698 00:26:54.019 12:03:20 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:54.019 12:03:20 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:54.019 
12:03:20 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1815698' 00:26:54.019 killing process with pid 1815698 00:26:54.019 12:03:20 compress_compdev -- common/autotest_common.sh@965 -- # kill 1815698 00:26:54.019 12:03:20 compress_compdev -- common/autotest_common.sh@970 -- # wait 1815698 00:26:57.328 12:03:24 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:26:57.328 12:03:24 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:26:57.328 00:26:57.328 real 0m47.766s 00:26:57.328 user 1m50.230s 00:26:57.328 sys 0m5.853s 00:26:57.328 12:03:24 compress_compdev -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:57.328 12:03:24 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:57.328 ************************************ 00:26:57.328 END TEST compress_compdev 00:26:57.328 ************************************ 00:26:57.328 12:03:24 -- spdk/autotest.sh@345 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:26:57.328 12:03:24 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:26:57.328 12:03:24 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:57.328 12:03:24 -- common/autotest_common.sh@10 -- # set +x 00:26:57.328 ************************************ 00:26:57.328 START TEST compress_isal 00:26:57.328 ************************************ 00:26:57.328 12:03:24 compress_isal -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:26:57.328 * Looking for test storage... 
00:26:57.328 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:26:57.328 12:03:24 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:26:57.328 12:03:24 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:57.328 12:03:24 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:57.329 12:03:24 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:57.329 12:03:24 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:57.329 12:03:24 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:57.329 12:03:24 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:57.329 12:03:24 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:57.329 12:03:24 compress_isal -- paths/export.sh@5 -- # export PATH 00:26:57.329 12:03:24 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:57.329 12:03:24 compress_isal -- nvmf/common.sh@47 -- # : 0 00:26:57.329 12:03:24 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:26:57.329 12:03:24 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:26:57.329 12:03:24 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:26:57.329 12:03:24 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:26:57.329 12:03:24 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:26:57.329 12:03:24 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:26:57.329 12:03:24 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:26:57.329 12:03:24 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:26:57.329 12:03:24 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:57.329 12:03:24 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:26:57.329 12:03:24 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:26:57.329 12:03:24 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:26:57.329 12:03:24 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:26:57.329 12:03:24 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1817117 00:26:57.329 12:03:24 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:57.329 12:03:24 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 1817117 00:26:57.329 12:03:24 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:26:57.329 12:03:24 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 1817117 ']' 00:26:57.329 12:03:24 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:57.329 12:03:24 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:57.329 12:03:24 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:57.329 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:57.329 12:03:24 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:57.329 12:03:24 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:57.329 [2024-05-14 12:03:24.304182] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:26:57.329 [2024-05-14 12:03:24.304251] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1817117 ] 00:26:57.588 [2024-05-14 12:03:24.424566] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:57.588 [2024-05-14 12:03:24.527602] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:57.588 [2024-05-14 12:03:24.527608] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:58.156 12:03:25 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:58.156 12:03:25 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:26:58.156 12:03:25 compress_isal -- compress/compress.sh@74 -- # create_vols 00:26:58.156 12:03:25 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:58.156 12:03:25 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:58.725 12:03:25 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:58.725 12:03:25 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:26:58.725 12:03:25 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:58.725 12:03:25 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:58.725 12:03:25 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:58.725 12:03:25 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:58.725 12:03:25 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:58.984 12:03:26 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 
00:26:59.243 [ 00:26:59.243 { 00:26:59.243 "name": "Nvme0n1", 00:26:59.243 "aliases": [ 00:26:59.243 "01000000-0000-0000-5cd2-e43197705251" 00:26:59.243 ], 00:26:59.243 "product_name": "NVMe disk", 00:26:59.243 "block_size": 512, 00:26:59.243 "num_blocks": 15002931888, 00:26:59.243 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:26:59.244 "assigned_rate_limits": { 00:26:59.244 "rw_ios_per_sec": 0, 00:26:59.244 "rw_mbytes_per_sec": 0, 00:26:59.244 "r_mbytes_per_sec": 0, 00:26:59.244 "w_mbytes_per_sec": 0 00:26:59.244 }, 00:26:59.244 "claimed": false, 00:26:59.244 "zoned": false, 00:26:59.244 "supported_io_types": { 00:26:59.244 "read": true, 00:26:59.244 "write": true, 00:26:59.244 "unmap": true, 00:26:59.244 "write_zeroes": true, 00:26:59.244 "flush": true, 00:26:59.244 "reset": true, 00:26:59.244 "compare": false, 00:26:59.244 "compare_and_write": false, 00:26:59.244 "abort": true, 00:26:59.244 "nvme_admin": true, 00:26:59.244 "nvme_io": true 00:26:59.244 }, 00:26:59.244 "driver_specific": { 00:26:59.244 "nvme": [ 00:26:59.244 { 00:26:59.244 "pci_address": "0000:5e:00.0", 00:26:59.244 "trid": { 00:26:59.244 "trtype": "PCIe", 00:26:59.244 "traddr": "0000:5e:00.0" 00:26:59.244 }, 00:26:59.244 "ctrlr_data": { 00:26:59.244 "cntlid": 0, 00:26:59.244 "vendor_id": "0x8086", 00:26:59.244 "model_number": "INTEL SSDPF2KX076TZO", 00:26:59.244 "serial_number": "PHAC0301002G7P6CGN", 00:26:59.244 "firmware_revision": "JCV10200", 00:26:59.244 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:26:59.244 "oacs": { 00:26:59.244 "security": 1, 00:26:59.244 "format": 1, 00:26:59.244 "firmware": 1, 00:26:59.244 "ns_manage": 1 00:26:59.244 }, 00:26:59.244 "multi_ctrlr": false, 00:26:59.244 "ana_reporting": false 00:26:59.244 }, 00:26:59.244 "vs": { 00:26:59.244 "nvme_version": "1.3" 00:26:59.244 }, 00:26:59.244 "ns_data": { 00:26:59.244 "id": 1, 00:26:59.244 "can_share": false 00:26:59.244 }, 00:26:59.244 "security": { 00:26:59.244 "opal": true 00:26:59.244 } 00:26:59.244 } 
00:26:59.244 ], 00:26:59.244 "mp_policy": "active_passive" 00:26:59.244 } 00:26:59.244 } 00:26:59.244 ] 00:26:59.503 12:03:26 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:59.503 12:03:26 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:02.039 2a0c1692-29c8-4683-bf0c-5991e12d30b9 00:27:02.039 12:03:28 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:02.039 86991fb6-38d6-41bb-b0ba-0348cda473a5 00:27:02.039 12:03:29 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:02.039 12:03:29 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:27:02.039 12:03:29 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:02.039 12:03:29 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:02.039 12:03:29 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:02.039 12:03:29 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:02.039 12:03:29 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:02.298 12:03:29 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:02.558 [ 00:27:02.558 { 00:27:02.558 "name": "86991fb6-38d6-41bb-b0ba-0348cda473a5", 00:27:02.558 "aliases": [ 00:27:02.558 "lvs0/lv0" 00:27:02.558 ], 00:27:02.558 "product_name": "Logical Volume", 00:27:02.558 "block_size": 512, 00:27:02.558 "num_blocks": 204800, 00:27:02.558 "uuid": "86991fb6-38d6-41bb-b0ba-0348cda473a5", 00:27:02.558 "assigned_rate_limits": { 00:27:02.558 "rw_ios_per_sec": 0, 00:27:02.558 "rw_mbytes_per_sec": 0, 00:27:02.558 "r_mbytes_per_sec": 0, 
00:27:02.558 "w_mbytes_per_sec": 0 00:27:02.558 }, 00:27:02.558 "claimed": false, 00:27:02.558 "zoned": false, 00:27:02.558 "supported_io_types": { 00:27:02.558 "read": true, 00:27:02.558 "write": true, 00:27:02.558 "unmap": true, 00:27:02.558 "write_zeroes": true, 00:27:02.558 "flush": false, 00:27:02.558 "reset": true, 00:27:02.558 "compare": false, 00:27:02.558 "compare_and_write": false, 00:27:02.558 "abort": false, 00:27:02.558 "nvme_admin": false, 00:27:02.558 "nvme_io": false 00:27:02.558 }, 00:27:02.558 "driver_specific": { 00:27:02.558 "lvol": { 00:27:02.558 "lvol_store_uuid": "2a0c1692-29c8-4683-bf0c-5991e12d30b9", 00:27:02.558 "base_bdev": "Nvme0n1", 00:27:02.558 "thin_provision": true, 00:27:02.558 "num_allocated_clusters": 0, 00:27:02.558 "snapshot": false, 00:27:02.558 "clone": false, 00:27:02.558 "esnap_clone": false 00:27:02.558 } 00:27:02.558 } 00:27:02.558 } 00:27:02.558 ] 00:27:02.558 12:03:29 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:02.558 12:03:29 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:02.558 12:03:29 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:02.817 [2024-05-14 12:03:29.737425] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:02.817 COMP_lvs0/lv0 00:27:02.817 12:03:29 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:02.817 12:03:29 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:27:02.817 12:03:29 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:02.817 12:03:29 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:02.817 12:03:29 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:02.817 12:03:29 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:02.817 12:03:29 
compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:03.077 12:03:29 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:03.336 [ 00:27:03.336 { 00:27:03.336 "name": "COMP_lvs0/lv0", 00:27:03.336 "aliases": [ 00:27:03.336 "e052bd02-a79f-5e90-a37c-e94272569c14" 00:27:03.336 ], 00:27:03.336 "product_name": "compress", 00:27:03.336 "block_size": 512, 00:27:03.336 "num_blocks": 200704, 00:27:03.336 "uuid": "e052bd02-a79f-5e90-a37c-e94272569c14", 00:27:03.336 "assigned_rate_limits": { 00:27:03.336 "rw_ios_per_sec": 0, 00:27:03.336 "rw_mbytes_per_sec": 0, 00:27:03.336 "r_mbytes_per_sec": 0, 00:27:03.336 "w_mbytes_per_sec": 0 00:27:03.336 }, 00:27:03.336 "claimed": false, 00:27:03.336 "zoned": false, 00:27:03.336 "supported_io_types": { 00:27:03.336 "read": true, 00:27:03.336 "write": true, 00:27:03.336 "unmap": false, 00:27:03.336 "write_zeroes": true, 00:27:03.336 "flush": false, 00:27:03.336 "reset": false, 00:27:03.336 "compare": false, 00:27:03.336 "compare_and_write": false, 00:27:03.336 "abort": false, 00:27:03.336 "nvme_admin": false, 00:27:03.336 "nvme_io": false 00:27:03.336 }, 00:27:03.336 "driver_specific": { 00:27:03.336 "compress": { 00:27:03.336 "name": "COMP_lvs0/lv0", 00:27:03.336 "base_bdev_name": "86991fb6-38d6-41bb-b0ba-0348cda473a5" 00:27:03.336 } 00:27:03.336 } 00:27:03.336 } 00:27:03.336 ] 00:27:03.336 12:03:30 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:03.336 12:03:30 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:03.336 Running I/O for 3 seconds... 
00:27:06.637 00:27:06.637 Latency(us) 00:27:06.637 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:06.637 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:06.637 Verification LBA range: start 0x0 length 0x3100 00:27:06.637 COMP_lvs0/lv0 : 3.01 2880.30 11.25 0.00 0.00 11062.58 673.17 9744.92 00:27:06.637 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:06.637 Verification LBA range: start 0x3100 length 0x3100 00:27:06.637 COMP_lvs0/lv0 : 3.01 2877.69 11.24 0.00 0.00 11080.10 1061.40 9573.95 00:27:06.637 =================================================================================================================== 00:27:06.637 Total : 5757.99 22.49 0.00 0.00 11071.34 673.17 9744.92 00:27:06.637 0 00:27:06.637 12:03:33 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:06.637 12:03:33 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:06.637 12:03:33 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:06.895 12:03:33 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:06.895 12:03:33 compress_isal -- compress/compress.sh@78 -- # killprocess 1817117 00:27:06.895 12:03:33 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 1817117 ']' 00:27:06.895 12:03:33 compress_isal -- common/autotest_common.sh@950 -- # kill -0 1817117 00:27:06.895 12:03:33 compress_isal -- common/autotest_common.sh@951 -- # uname 00:27:06.895 12:03:33 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:06.895 12:03:33 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1817117 00:27:06.895 12:03:33 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:27:06.895 12:03:33 
compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:27:06.895 12:03:33 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1817117' 00:27:06.895 killing process with pid 1817117 00:27:06.896 12:03:33 compress_isal -- common/autotest_common.sh@965 -- # kill 1817117 00:27:06.896 Received shutdown signal, test time was about 3.000000 seconds 00:27:06.896 00:27:06.896 Latency(us) 00:27:06.896 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:06.896 =================================================================================================================== 00:27:06.896 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:06.896 12:03:33 compress_isal -- common/autotest_common.sh@970 -- # wait 1817117 00:27:10.180 12:03:36 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:27:10.180 12:03:36 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:10.180 12:03:36 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1818720 00:27:10.180 12:03:36 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:10.180 12:03:36 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:10.180 12:03:36 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1818720 00:27:10.180 12:03:36 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 1818720 ']' 00:27:10.180 12:03:36 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:10.180 12:03:36 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:10.180 12:03:36 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:27:10.180 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:10.180 12:03:36 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:10.180 12:03:36 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:10.180 [2024-05-14 12:03:36.959805] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:27:10.180 [2024-05-14 12:03:36.959882] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1818720 ] 00:27:10.180 [2024-05-14 12:03:37.084406] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:10.180 [2024-05-14 12:03:37.188639] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:10.180 [2024-05-14 12:03:37.188646] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:11.115 12:03:37 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:11.115 12:03:37 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:27:11.115 12:03:37 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:27:11.116 12:03:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:11.116 12:03:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:11.684 12:03:38 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:11.684 12:03:38 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:27:11.684 12:03:38 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:11.684 12:03:38 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:11.684 12:03:38 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:11.684 12:03:38 
compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:11.684 12:03:38 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:11.684 12:03:38 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:11.943 [ 00:27:11.943 { 00:27:11.943 "name": "Nvme0n1", 00:27:11.943 "aliases": [ 00:27:11.943 "01000000-0000-0000-5cd2-e43197705251" 00:27:11.943 ], 00:27:11.943 "product_name": "NVMe disk", 00:27:11.943 "block_size": 512, 00:27:11.943 "num_blocks": 15002931888, 00:27:11.943 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:27:11.943 "assigned_rate_limits": { 00:27:11.943 "rw_ios_per_sec": 0, 00:27:11.943 "rw_mbytes_per_sec": 0, 00:27:11.943 "r_mbytes_per_sec": 0, 00:27:11.943 "w_mbytes_per_sec": 0 00:27:11.943 }, 00:27:11.943 "claimed": false, 00:27:11.943 "zoned": false, 00:27:11.943 "supported_io_types": { 00:27:11.943 "read": true, 00:27:11.943 "write": true, 00:27:11.943 "unmap": true, 00:27:11.943 "write_zeroes": true, 00:27:11.943 "flush": true, 00:27:11.943 "reset": true, 00:27:11.943 "compare": false, 00:27:11.943 "compare_and_write": false, 00:27:11.943 "abort": true, 00:27:11.943 "nvme_admin": true, 00:27:11.943 "nvme_io": true 00:27:11.943 }, 00:27:11.943 "driver_specific": { 00:27:11.943 "nvme": [ 00:27:11.943 { 00:27:11.943 "pci_address": "0000:5e:00.0", 00:27:11.943 "trid": { 00:27:11.943 "trtype": "PCIe", 00:27:11.943 "traddr": "0000:5e:00.0" 00:27:11.943 }, 00:27:11.943 "ctrlr_data": { 00:27:11.943 "cntlid": 0, 00:27:11.943 "vendor_id": "0x8086", 00:27:11.943 "model_number": "INTEL SSDPF2KX076TZO", 00:27:11.943 "serial_number": "PHAC0301002G7P6CGN", 00:27:11.943 "firmware_revision": "JCV10200", 00:27:11.943 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:27:11.943 "oacs": { 00:27:11.943 "security": 1, 00:27:11.943 "format": 1, 
00:27:11.943 "firmware": 1, 00:27:11.943 "ns_manage": 1 00:27:11.943 }, 00:27:11.943 "multi_ctrlr": false, 00:27:11.943 "ana_reporting": false 00:27:11.943 }, 00:27:11.943 "vs": { 00:27:11.943 "nvme_version": "1.3" 00:27:11.943 }, 00:27:11.943 "ns_data": { 00:27:11.943 "id": 1, 00:27:11.943 "can_share": false 00:27:11.943 }, 00:27:11.943 "security": { 00:27:11.943 "opal": true 00:27:11.943 } 00:27:11.943 } 00:27:11.943 ], 00:27:11.943 "mp_policy": "active_passive" 00:27:11.943 } 00:27:11.943 } 00:27:11.943 ] 00:27:11.943 12:03:39 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:11.943 12:03:39 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:14.478 e4665267-9a1f-4ffc-a350-d73f40c52e9e 00:27:14.478 12:03:41 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:14.737 7ba24bd2-74dc-4b99-acdb-468743ddedff 00:27:14.737 12:03:41 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:14.737 12:03:41 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:27:14.737 12:03:41 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:14.737 12:03:41 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:14.737 12:03:41 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:14.737 12:03:41 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:14.737 12:03:41 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:14.996 12:03:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:15.255 [ 00:27:15.255 { 00:27:15.255 "name": 
"7ba24bd2-74dc-4b99-acdb-468743ddedff", 00:27:15.255 "aliases": [ 00:27:15.255 "lvs0/lv0" 00:27:15.255 ], 00:27:15.255 "product_name": "Logical Volume", 00:27:15.255 "block_size": 512, 00:27:15.255 "num_blocks": 204800, 00:27:15.255 "uuid": "7ba24bd2-74dc-4b99-acdb-468743ddedff", 00:27:15.255 "assigned_rate_limits": { 00:27:15.255 "rw_ios_per_sec": 0, 00:27:15.255 "rw_mbytes_per_sec": 0, 00:27:15.255 "r_mbytes_per_sec": 0, 00:27:15.255 "w_mbytes_per_sec": 0 00:27:15.255 }, 00:27:15.255 "claimed": false, 00:27:15.255 "zoned": false, 00:27:15.255 "supported_io_types": { 00:27:15.255 "read": true, 00:27:15.255 "write": true, 00:27:15.255 "unmap": true, 00:27:15.255 "write_zeroes": true, 00:27:15.255 "flush": false, 00:27:15.255 "reset": true, 00:27:15.255 "compare": false, 00:27:15.255 "compare_and_write": false, 00:27:15.255 "abort": false, 00:27:15.255 "nvme_admin": false, 00:27:15.255 "nvme_io": false 00:27:15.255 }, 00:27:15.255 "driver_specific": { 00:27:15.255 "lvol": { 00:27:15.255 "lvol_store_uuid": "e4665267-9a1f-4ffc-a350-d73f40c52e9e", 00:27:15.255 "base_bdev": "Nvme0n1", 00:27:15.255 "thin_provision": true, 00:27:15.255 "num_allocated_clusters": 0, 00:27:15.255 "snapshot": false, 00:27:15.255 "clone": false, 00:27:15.255 "esnap_clone": false 00:27:15.255 } 00:27:15.255 } 00:27:15.255 } 00:27:15.255 ] 00:27:15.255 12:03:42 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:15.255 12:03:42 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:27:15.255 12:03:42 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:27:15.513 [2024-05-14 12:03:42.432793] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:15.513 COMP_lvs0/lv0 00:27:15.513 12:03:42 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:15.513 12:03:42 compress_isal -- 
common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:27:15.513 12:03:42 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:15.513 12:03:42 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:15.513 12:03:42 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:15.513 12:03:42 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:15.513 12:03:42 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:15.772 12:03:42 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:16.032 [ 00:27:16.032 { 00:27:16.032 "name": "COMP_lvs0/lv0", 00:27:16.032 "aliases": [ 00:27:16.032 "3dd4a5f9-41dc-5451-b9b1-d6d7fc33ba9a" 00:27:16.032 ], 00:27:16.032 "product_name": "compress", 00:27:16.032 "block_size": 512, 00:27:16.032 "num_blocks": 200704, 00:27:16.032 "uuid": "3dd4a5f9-41dc-5451-b9b1-d6d7fc33ba9a", 00:27:16.032 "assigned_rate_limits": { 00:27:16.032 "rw_ios_per_sec": 0, 00:27:16.032 "rw_mbytes_per_sec": 0, 00:27:16.032 "r_mbytes_per_sec": 0, 00:27:16.032 "w_mbytes_per_sec": 0 00:27:16.032 }, 00:27:16.032 "claimed": false, 00:27:16.032 "zoned": false, 00:27:16.032 "supported_io_types": { 00:27:16.032 "read": true, 00:27:16.032 "write": true, 00:27:16.032 "unmap": false, 00:27:16.032 "write_zeroes": true, 00:27:16.032 "flush": false, 00:27:16.032 "reset": false, 00:27:16.032 "compare": false, 00:27:16.032 "compare_and_write": false, 00:27:16.032 "abort": false, 00:27:16.032 "nvme_admin": false, 00:27:16.032 "nvme_io": false 00:27:16.032 }, 00:27:16.032 "driver_specific": { 00:27:16.032 "compress": { 00:27:16.032 "name": "COMP_lvs0/lv0", 00:27:16.032 "base_bdev_name": "7ba24bd2-74dc-4b99-acdb-468743ddedff" 00:27:16.032 } 00:27:16.032 } 00:27:16.032 } 00:27:16.032 ] 00:27:16.032 12:03:42 
compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:16.032 12:03:42 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:16.032 Running I/O for 3 seconds... 00:27:19.366 00:27:19.366 Latency(us) 00:27:19.366 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:19.366 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:19.366 Verification LBA range: start 0x0 length 0x3100 00:27:19.367 COMP_lvs0/lv0 : 3.00 3863.39 15.09 0.00 0.00 8228.20 616.18 7921.31 00:27:19.367 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:19.367 Verification LBA range: start 0x3100 length 0x3100 00:27:19.367 COMP_lvs0/lv0 : 3.00 3867.39 15.11 0.00 0.00 8231.09 569.88 8035.28 00:27:19.367 =================================================================================================================== 00:27:19.367 Total : 7730.78 30.20 0.00 0.00 8229.65 569.88 8035.28 00:27:19.367 0 00:27:19.367 12:03:46 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:19.367 12:03:46 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:19.367 12:03:46 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:19.626 12:03:46 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:19.626 12:03:46 compress_isal -- compress/compress.sh@78 -- # killprocess 1818720 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 1818720 ']' 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@950 -- # kill -0 1818720 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@951 -- # uname 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@951 
-- # '[' Linux = Linux ']' 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1818720 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1818720' 00:27:19.626 killing process with pid 1818720 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@965 -- # kill 1818720 00:27:19.626 Received shutdown signal, test time was about 3.000000 seconds 00:27:19.626 00:27:19.626 Latency(us) 00:27:19.626 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:19.626 =================================================================================================================== 00:27:19.626 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:19.626 12:03:46 compress_isal -- common/autotest_common.sh@970 -- # wait 1818720 00:27:22.916 12:03:49 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:27:22.916 12:03:49 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:22.916 12:03:49 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1820330 00:27:22.916 12:03:49 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:22.916 12:03:49 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:22.916 12:03:49 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1820330 00:27:22.916 12:03:49 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 1820330 ']' 00:27:22.916 12:03:49 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:22.916 12:03:49 compress_isal -- 
common/autotest_common.sh@832 -- # local max_retries=100 00:27:22.916 12:03:49 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:22.916 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:22.916 12:03:49 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:22.916 12:03:49 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:22.916 [2024-05-14 12:03:49.705850] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:27:22.916 [2024-05-14 12:03:49.705925] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1820330 ] 00:27:22.916 [2024-05-14 12:03:49.827885] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:22.916 [2024-05-14 12:03:49.933044] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:22.916 [2024-05-14 12:03:49.933050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:23.851 12:03:50 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:23.851 12:03:50 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:27:23.851 12:03:50 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:27:23.851 12:03:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:23.851 12:03:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:24.418 12:03:51 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:24.419 12:03:51 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:27:24.419 12:03:51 compress_isal -- 
common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:24.419 12:03:51 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:24.419 12:03:51 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:24.419 12:03:51 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:24.419 12:03:51 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:24.419 12:03:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:24.678 [ 00:27:24.678 { 00:27:24.678 "name": "Nvme0n1", 00:27:24.678 "aliases": [ 00:27:24.678 "01000000-0000-0000-5cd2-e43197705251" 00:27:24.678 ], 00:27:24.678 "product_name": "NVMe disk", 00:27:24.678 "block_size": 512, 00:27:24.678 "num_blocks": 15002931888, 00:27:24.678 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:27:24.678 "assigned_rate_limits": { 00:27:24.678 "rw_ios_per_sec": 0, 00:27:24.678 "rw_mbytes_per_sec": 0, 00:27:24.678 "r_mbytes_per_sec": 0, 00:27:24.678 "w_mbytes_per_sec": 0 00:27:24.678 }, 00:27:24.678 "claimed": false, 00:27:24.678 "zoned": false, 00:27:24.678 "supported_io_types": { 00:27:24.678 "read": true, 00:27:24.678 "write": true, 00:27:24.678 "unmap": true, 00:27:24.678 "write_zeroes": true, 00:27:24.678 "flush": true, 00:27:24.678 "reset": true, 00:27:24.678 "compare": false, 00:27:24.678 "compare_and_write": false, 00:27:24.678 "abort": true, 00:27:24.678 "nvme_admin": true, 00:27:24.678 "nvme_io": true 00:27:24.678 }, 00:27:24.678 "driver_specific": { 00:27:24.678 "nvme": [ 00:27:24.678 { 00:27:24.678 "pci_address": "0000:5e:00.0", 00:27:24.678 "trid": { 00:27:24.678 "trtype": "PCIe", 00:27:24.678 "traddr": "0000:5e:00.0" 00:27:24.678 }, 00:27:24.678 "ctrlr_data": { 00:27:24.678 "cntlid": 0, 00:27:24.678 "vendor_id": "0x8086", 00:27:24.678 "model_number": "INTEL SSDPF2KX076TZO", 
00:27:24.678 "serial_number": "PHAC0301002G7P6CGN", 00:27:24.678 "firmware_revision": "JCV10200", 00:27:24.678 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:27:24.678 "oacs": { 00:27:24.678 "security": 1, 00:27:24.678 "format": 1, 00:27:24.678 "firmware": 1, 00:27:24.678 "ns_manage": 1 00:27:24.678 }, 00:27:24.678 "multi_ctrlr": false, 00:27:24.678 "ana_reporting": false 00:27:24.678 }, 00:27:24.678 "vs": { 00:27:24.678 "nvme_version": "1.3" 00:27:24.678 }, 00:27:24.678 "ns_data": { 00:27:24.678 "id": 1, 00:27:24.678 "can_share": false 00:27:24.678 }, 00:27:24.678 "security": { 00:27:24.678 "opal": true 00:27:24.678 } 00:27:24.678 } 00:27:24.678 ], 00:27:24.678 "mp_policy": "active_passive" 00:27:24.678 } 00:27:24.678 } 00:27:24.678 ] 00:27:24.678 12:03:51 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:24.678 12:03:51 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:27.213 2d0eaa2f-c4c4-4aed-a94b-53f0db696a9c 00:27:27.213 12:03:54 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:27.472 1b428245-a7e6-4589-8c7f-105ab766ca0f 00:27:27.472 12:03:54 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:27.472 12:03:54 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:27:27.472 12:03:54 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:27.472 12:03:54 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:27.472 12:03:54 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:27.472 12:03:54 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:27.472 12:03:54 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 
00:27:27.731 12:03:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:27.991 [ 00:27:27.991 { 00:27:27.991 "name": "1b428245-a7e6-4589-8c7f-105ab766ca0f", 00:27:27.991 "aliases": [ 00:27:27.991 "lvs0/lv0" 00:27:27.991 ], 00:27:27.991 "product_name": "Logical Volume", 00:27:27.991 "block_size": 512, 00:27:27.991 "num_blocks": 204800, 00:27:27.991 "uuid": "1b428245-a7e6-4589-8c7f-105ab766ca0f", 00:27:27.991 "assigned_rate_limits": { 00:27:27.991 "rw_ios_per_sec": 0, 00:27:27.991 "rw_mbytes_per_sec": 0, 00:27:27.991 "r_mbytes_per_sec": 0, 00:27:27.991 "w_mbytes_per_sec": 0 00:27:27.991 }, 00:27:27.991 "claimed": false, 00:27:27.991 "zoned": false, 00:27:27.991 "supported_io_types": { 00:27:27.991 "read": true, 00:27:27.991 "write": true, 00:27:27.991 "unmap": true, 00:27:27.991 "write_zeroes": true, 00:27:27.991 "flush": false, 00:27:27.991 "reset": true, 00:27:27.991 "compare": false, 00:27:27.991 "compare_and_write": false, 00:27:27.991 "abort": false, 00:27:27.991 "nvme_admin": false, 00:27:27.991 "nvme_io": false 00:27:27.991 }, 00:27:27.991 "driver_specific": { 00:27:27.991 "lvol": { 00:27:27.991 "lvol_store_uuid": "2d0eaa2f-c4c4-4aed-a94b-53f0db696a9c", 00:27:27.991 "base_bdev": "Nvme0n1", 00:27:27.991 "thin_provision": true, 00:27:27.991 "num_allocated_clusters": 0, 00:27:27.991 "snapshot": false, 00:27:27.991 "clone": false, 00:27:27.991 "esnap_clone": false 00:27:27.991 } 00:27:27.991 } 00:27:27.991 } 00:27:27.991 ] 00:27:27.991 12:03:54 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:27.991 12:03:54 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:27:27.991 12:03:54 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:27:27.991 [2024-05-14 12:03:55.076516] vbdev_compress.c:1016:vbdev_compress_claim: 
*NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:28.250 COMP_lvs0/lv0 00:27:28.250 12:03:55 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:28.250 12:03:55 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:27:28.250 12:03:55 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:28.250 12:03:55 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:28.250 12:03:55 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:28.250 12:03:55 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:28.250 12:03:55 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:28.509 12:03:55 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:28.509 [ 00:27:28.509 { 00:27:28.509 "name": "COMP_lvs0/lv0", 00:27:28.509 "aliases": [ 00:27:28.509 "8c863a7d-7ed4-5e42-84e2-b5302c61a82f" 00:27:28.509 ], 00:27:28.509 "product_name": "compress", 00:27:28.509 "block_size": 4096, 00:27:28.509 "num_blocks": 25088, 00:27:28.509 "uuid": "8c863a7d-7ed4-5e42-84e2-b5302c61a82f", 00:27:28.509 "assigned_rate_limits": { 00:27:28.509 "rw_ios_per_sec": 0, 00:27:28.509 "rw_mbytes_per_sec": 0, 00:27:28.509 "r_mbytes_per_sec": 0, 00:27:28.509 "w_mbytes_per_sec": 0 00:27:28.509 }, 00:27:28.509 "claimed": false, 00:27:28.509 "zoned": false, 00:27:28.509 "supported_io_types": { 00:27:28.509 "read": true, 00:27:28.509 "write": true, 00:27:28.509 "unmap": false, 00:27:28.509 "write_zeroes": true, 00:27:28.509 "flush": false, 00:27:28.509 "reset": false, 00:27:28.509 "compare": false, 00:27:28.509 "compare_and_write": false, 00:27:28.509 "abort": false, 00:27:28.509 "nvme_admin": false, 00:27:28.509 "nvme_io": false 00:27:28.509 }, 00:27:28.509 "driver_specific": { 
00:27:28.509 "compress": { 00:27:28.509 "name": "COMP_lvs0/lv0", 00:27:28.509 "base_bdev_name": "1b428245-a7e6-4589-8c7f-105ab766ca0f" 00:27:28.509 } 00:27:28.509 } 00:27:28.509 } 00:27:28.509 ] 00:27:28.768 12:03:55 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:28.768 12:03:55 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:28.768 Running I/O for 3 seconds... 00:27:32.056 00:27:32.056 Latency(us) 00:27:32.056 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:32.056 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:32.056 Verification LBA range: start 0x0 length 0x3100 00:27:32.056 COMP_lvs0/lv0 : 3.00 3896.37 15.22 0.00 0.00 8157.51 662.48 7038.00 00:27:32.056 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:32.056 Verification LBA range: start 0x3100 length 0x3100 00:27:32.056 COMP_lvs0/lv0 : 3.00 3897.97 15.23 0.00 0.00 8166.28 573.44 6867.03 00:27:32.056 =================================================================================================================== 00:27:32.056 Total : 7794.34 30.45 0.00 0.00 8161.89 573.44 7038.00 00:27:32.056 0 00:27:32.056 12:03:58 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:27:32.056 12:03:58 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:32.056 12:03:59 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:32.315 12:03:59 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:32.315 12:03:59 compress_isal -- compress/compress.sh@78 -- # killprocess 1820330 00:27:32.315 12:03:59 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 1820330 ']' 00:27:32.315 12:03:59 
compress_isal -- common/autotest_common.sh@950 -- # kill -0 1820330 00:27:32.315 12:03:59 compress_isal -- common/autotest_common.sh@951 -- # uname 00:27:32.315 12:03:59 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:32.315 12:03:59 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1820330 00:27:32.315 12:03:59 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:27:32.315 12:03:59 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:27:32.315 12:03:59 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1820330' 00:27:32.315 killing process with pid 1820330 00:27:32.315 12:03:59 compress_isal -- common/autotest_common.sh@965 -- # kill 1820330 00:27:32.315 Received shutdown signal, test time was about 3.000000 seconds 00:27:32.315 00:27:32.315 Latency(us) 00:27:32.315 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:32.315 =================================================================================================================== 00:27:32.315 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:32.315 12:03:59 compress_isal -- common/autotest_common.sh@970 -- # wait 1820330 00:27:35.606 12:04:02 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:27:35.606 12:04:02 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:35.606 12:04:02 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1821932 00:27:35.606 12:04:02 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:35.606 12:04:02 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:27:35.606 12:04:02 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1821932 00:27:35.606 12:04:02 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 1821932 ']' 
00:27:35.606 12:04:02 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:35.606 12:04:02 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:35.606 12:04:02 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:35.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:35.606 12:04:02 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:35.606 12:04:02 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:35.606 [2024-05-14 12:04:02.133988] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:27:35.606 [2024-05-14 12:04:02.134056] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1821932 ] 00:27:35.606 [2024-05-14 12:04:02.261376] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:35.606 [2024-05-14 12:04:02.367381] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:35.606 [2024-05-14 12:04:02.367471] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:35.606 [2024-05-14 12:04:02.367475] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:36.174 12:04:03 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:36.174 12:04:03 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:27:36.174 12:04:03 compress_isal -- compress/compress.sh@58 -- # create_vols 00:27:36.174 12:04:03 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:36.174 12:04:03 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 
00:27:36.742 12:04:03 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:36.742 12:04:03 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:27:36.742 12:04:03 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:36.742 12:04:03 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:36.742 12:04:03 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:36.742 12:04:03 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:36.742 12:04:03 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:37.001 12:04:03 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:37.277 [ 00:27:37.277 { 00:27:37.277 "name": "Nvme0n1", 00:27:37.277 "aliases": [ 00:27:37.277 "01000000-0000-0000-5cd2-e43197705251" 00:27:37.277 ], 00:27:37.277 "product_name": "NVMe disk", 00:27:37.277 "block_size": 512, 00:27:37.277 "num_blocks": 15002931888, 00:27:37.277 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:27:37.277 "assigned_rate_limits": { 00:27:37.277 "rw_ios_per_sec": 0, 00:27:37.277 "rw_mbytes_per_sec": 0, 00:27:37.277 "r_mbytes_per_sec": 0, 00:27:37.277 "w_mbytes_per_sec": 0 00:27:37.277 }, 00:27:37.277 "claimed": false, 00:27:37.277 "zoned": false, 00:27:37.277 "supported_io_types": { 00:27:37.277 "read": true, 00:27:37.277 "write": true, 00:27:37.277 "unmap": true, 00:27:37.277 "write_zeroes": true, 00:27:37.277 "flush": true, 00:27:37.277 "reset": true, 00:27:37.277 "compare": false, 00:27:37.277 "compare_and_write": false, 00:27:37.277 "abort": true, 00:27:37.277 "nvme_admin": true, 00:27:37.277 "nvme_io": true 00:27:37.277 }, 00:27:37.277 "driver_specific": { 00:27:37.277 "nvme": [ 00:27:37.277 { 00:27:37.277 "pci_address": "0000:5e:00.0", 00:27:37.277 "trid": { 00:27:37.277 
"trtype": "PCIe", 00:27:37.277 "traddr": "0000:5e:00.0" 00:27:37.277 }, 00:27:37.277 "ctrlr_data": { 00:27:37.277 "cntlid": 0, 00:27:37.277 "vendor_id": "0x8086", 00:27:37.277 "model_number": "INTEL SSDPF2KX076TZO", 00:27:37.277 "serial_number": "PHAC0301002G7P6CGN", 00:27:37.277 "firmware_revision": "JCV10200", 00:27:37.277 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:27:37.277 "oacs": { 00:27:37.277 "security": 1, 00:27:37.277 "format": 1, 00:27:37.277 "firmware": 1, 00:27:37.277 "ns_manage": 1 00:27:37.277 }, 00:27:37.277 "multi_ctrlr": false, 00:27:37.277 "ana_reporting": false 00:27:37.277 }, 00:27:37.277 "vs": { 00:27:37.277 "nvme_version": "1.3" 00:27:37.277 }, 00:27:37.277 "ns_data": { 00:27:37.277 "id": 1, 00:27:37.277 "can_share": false 00:27:37.277 }, 00:27:37.277 "security": { 00:27:37.277 "opal": false 00:27:37.277 } 00:27:37.277 } 00:27:37.277 ], 00:27:37.277 "mp_policy": "active_passive" 00:27:37.277 } 00:27:37.277 } 00:27:37.277 ] 00:27:37.277 12:04:04 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:37.277 12:04:04 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:39.864 f67a0611-da4b-4645-901a-81b50ce1c88d 00:27:39.864 12:04:06 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:39.864 6abb3396-ebff-47e7-aa55-f39193092f95 00:27:39.864 12:04:06 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:39.864 12:04:06 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:27:39.864 12:04:06 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:39.864 12:04:06 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:39.864 12:04:06 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:39.864 12:04:06 compress_isal 
-- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:39.864 12:04:06 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:40.124 12:04:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:40.124 [ 00:27:40.124 { 00:27:40.124 "name": "6abb3396-ebff-47e7-aa55-f39193092f95", 00:27:40.124 "aliases": [ 00:27:40.124 "lvs0/lv0" 00:27:40.124 ], 00:27:40.124 "product_name": "Logical Volume", 00:27:40.124 "block_size": 512, 00:27:40.124 "num_blocks": 204800, 00:27:40.124 "uuid": "6abb3396-ebff-47e7-aa55-f39193092f95", 00:27:40.124 "assigned_rate_limits": { 00:27:40.124 "rw_ios_per_sec": 0, 00:27:40.124 "rw_mbytes_per_sec": 0, 00:27:40.124 "r_mbytes_per_sec": 0, 00:27:40.124 "w_mbytes_per_sec": 0 00:27:40.124 }, 00:27:40.124 "claimed": false, 00:27:40.124 "zoned": false, 00:27:40.124 "supported_io_types": { 00:27:40.124 "read": true, 00:27:40.124 "write": true, 00:27:40.124 "unmap": true, 00:27:40.124 "write_zeroes": true, 00:27:40.124 "flush": false, 00:27:40.124 "reset": true, 00:27:40.124 "compare": false, 00:27:40.124 "compare_and_write": false, 00:27:40.124 "abort": false, 00:27:40.124 "nvme_admin": false, 00:27:40.124 "nvme_io": false 00:27:40.124 }, 00:27:40.124 "driver_specific": { 00:27:40.124 "lvol": { 00:27:40.124 "lvol_store_uuid": "f67a0611-da4b-4645-901a-81b50ce1c88d", 00:27:40.124 "base_bdev": "Nvme0n1", 00:27:40.124 "thin_provision": true, 00:27:40.124 "num_allocated_clusters": 0, 00:27:40.124 "snapshot": false, 00:27:40.124 "clone": false, 00:27:40.124 "esnap_clone": false 00:27:40.124 } 00:27:40.124 } 00:27:40.124 } 00:27:40.124 ] 00:27:40.383 12:04:07 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:40.383 12:04:07 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:40.383 12:04:07 compress_isal -- compress/compress.sh@42 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:40.383 [2024-05-14 12:04:07.441485] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:40.383 COMP_lvs0/lv0 00:27:40.383 12:04:07 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:40.383 12:04:07 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:27:40.383 12:04:07 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:27:40.383 12:04:07 compress_isal -- common/autotest_common.sh@897 -- # local i 00:27:40.383 12:04:07 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:27:40.383 12:04:07 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:27:40.383 12:04:07 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:40.642 12:04:07 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:40.901 [ 00:27:40.901 { 00:27:40.901 "name": "COMP_lvs0/lv0", 00:27:40.901 "aliases": [ 00:27:40.901 "15f0cd10-124d-57ff-b42f-d964353ef2d4" 00:27:40.901 ], 00:27:40.901 "product_name": "compress", 00:27:40.901 "block_size": 512, 00:27:40.901 "num_blocks": 200704, 00:27:40.901 "uuid": "15f0cd10-124d-57ff-b42f-d964353ef2d4", 00:27:40.901 "assigned_rate_limits": { 00:27:40.901 "rw_ios_per_sec": 0, 00:27:40.901 "rw_mbytes_per_sec": 0, 00:27:40.901 "r_mbytes_per_sec": 0, 00:27:40.901 "w_mbytes_per_sec": 0 00:27:40.901 }, 00:27:40.901 "claimed": false, 00:27:40.901 "zoned": false, 00:27:40.901 "supported_io_types": { 00:27:40.901 "read": true, 00:27:40.901 "write": true, 00:27:40.901 "unmap": false, 00:27:40.901 "write_zeroes": true, 00:27:40.901 "flush": false, 00:27:40.901 "reset": false, 00:27:40.901 
"compare": false, 00:27:40.901 "compare_and_write": false, 00:27:40.901 "abort": false, 00:27:40.901 "nvme_admin": false, 00:27:40.901 "nvme_io": false 00:27:40.901 }, 00:27:40.901 "driver_specific": { 00:27:40.901 "compress": { 00:27:40.901 "name": "COMP_lvs0/lv0", 00:27:40.901 "base_bdev_name": "6abb3396-ebff-47e7-aa55-f39193092f95" 00:27:40.901 } 00:27:40.901 } 00:27:40.901 } 00:27:40.901 ] 00:27:40.901 12:04:07 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:27:40.901 12:04:07 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:41.160 I/O targets: 00:27:41.160 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:27:41.160 00:27:41.160 00:27:41.160 CUnit - A unit testing framework for C - Version 2.1-3 00:27:41.160 http://cunit.sourceforge.net/ 00:27:41.160 00:27:41.160 00:27:41.160 Suite: bdevio tests on: COMP_lvs0/lv0 00:27:41.160 Test: blockdev write read block ...passed 00:27:41.160 Test: blockdev write zeroes read block ...passed 00:27:41.160 Test: blockdev write zeroes read no split ...passed 00:27:41.160 Test: blockdev write zeroes read split ...passed 00:27:41.160 Test: blockdev write zeroes read split partial ...passed 00:27:41.160 Test: blockdev reset ...[2024-05-14 12:04:08.110453] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:27:41.160 passed 00:27:41.160 Test: blockdev write read 8 blocks ...passed 00:27:41.160 Test: blockdev write read size > 128k ...passed 00:27:41.160 Test: blockdev write read invalid size ...passed 00:27:41.160 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:41.160 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:41.160 Test: blockdev write read max offset ...passed 00:27:41.160 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:41.160 Test: blockdev writev readv 8 blocks ...passed 00:27:41.160 Test: 
blockdev writev readv 30 x 1block ...passed 00:27:41.160 Test: blockdev writev readv block ...passed 00:27:41.160 Test: blockdev writev readv size > 128k ...passed 00:27:41.160 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:41.160 Test: blockdev comparev and writev ...passed 00:27:41.160 Test: blockdev nvme passthru rw ...passed 00:27:41.160 Test: blockdev nvme passthru vendor specific ...passed 00:27:41.160 Test: blockdev nvme admin passthru ...passed 00:27:41.160 Test: blockdev copy ...passed 00:27:41.160 00:27:41.160 Run Summary: Type Total Ran Passed Failed Inactive 00:27:41.160 suites 1 1 n/a 0 0 00:27:41.160 tests 23 23 23 0 0 00:27:41.160 asserts 130 130 130 0 n/a 00:27:41.160 00:27:41.160 Elapsed time = 0.113 seconds 00:27:41.160 0 00:27:41.160 12:04:08 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:27:41.160 12:04:08 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:41.419 12:04:08 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:41.678 12:04:08 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:27:41.678 12:04:08 compress_isal -- compress/compress.sh@62 -- # killprocess 1821932 00:27:41.678 12:04:08 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 1821932 ']' 00:27:41.678 12:04:08 compress_isal -- common/autotest_common.sh@950 -- # kill -0 1821932 00:27:41.678 12:04:08 compress_isal -- common/autotest_common.sh@951 -- # uname 00:27:41.678 12:04:08 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:41.679 12:04:08 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1821932 00:27:41.679 12:04:08 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:27:41.679 12:04:08 compress_isal -- common/autotest_common.sh@956 -- 
# '[' reactor_0 = sudo ']' 00:27:41.679 12:04:08 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1821932' 00:27:41.679 killing process with pid 1821932 00:27:41.679 12:04:08 compress_isal -- common/autotest_common.sh@965 -- # kill 1821932 00:27:41.679 12:04:08 compress_isal -- common/autotest_common.sh@970 -- # wait 1821932 00:27:44.964 12:04:11 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:27:44.964 12:04:11 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:27:44.964 00:27:44.964 real 0m47.345s 00:27:44.964 user 1m50.815s 00:27:44.964 sys 0m4.206s 00:27:44.964 12:04:11 compress_isal -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:44.964 12:04:11 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:44.964 ************************************ 00:27:44.964 END TEST compress_isal 00:27:44.964 ************************************ 00:27:44.964 12:04:11 -- spdk/autotest.sh@348 -- # '[' 0 -eq 1 ']' 00:27:44.964 12:04:11 -- spdk/autotest.sh@352 -- # '[' 1 -eq 1 ']' 00:27:44.964 12:04:11 -- spdk/autotest.sh@353 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:27:44.964 12:04:11 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:27:44.964 12:04:11 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:44.964 12:04:11 -- common/autotest_common.sh@10 -- # set +x 00:27:44.964 ************************************ 00:27:44.964 START TEST blockdev_crypto_aesni 00:27:44.964 ************************************ 00:27:44.964 12:04:11 blockdev_crypto_aesni -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:27:44.964 * Looking for test storage... 
00:27:44.964 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:27:44.964 12:04:11 
blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1823229 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:27:44.964 12:04:11 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1823229 00:27:44.964 12:04:11 blockdev_crypto_aesni -- common/autotest_common.sh@827 -- # '[' -z 1823229 ']' 00:27:44.964 12:04:11 blockdev_crypto_aesni -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:44.964 12:04:11 blockdev_crypto_aesni -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:44.964 12:04:11 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:44.964 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:27:44.964 12:04:11 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:44.964 12:04:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:44.964 [2024-05-14 12:04:11.733085] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:27:44.964 [2024-05-14 12:04:11.733174] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1823229 ] 00:27:44.964 [2024-05-14 12:04:11.863654] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:44.964 [2024-05-14 12:04:11.965855] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:45.922 12:04:12 blockdev_crypto_aesni -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:45.922 12:04:12 blockdev_crypto_aesni -- common/autotest_common.sh@860 -- # return 0 00:27:45.922 12:04:12 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:27:45.922 12:04:12 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:27:45.922 12:04:12 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:27:45.922 12:04:12 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:45.922 12:04:12 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:45.922 [2024-05-14 12:04:12.664056] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:45.923 [2024-05-14 12:04:12.672088] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:45.923 [2024-05-14 12:04:12.680103] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:45.923 [2024-05-14 12:04:12.754396] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto 
devices: 97 00:27:48.452 true 00:27:48.452 true 00:27:48.452 true 00:27:48.452 true 00:27:48.452 Malloc0 00:27:48.452 Malloc1 00:27:48.452 Malloc2 00:27:48.452 Malloc3 00:27:48.452 [2024-05-14 12:04:15.135339] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:48.452 crypto_ram 00:27:48.452 [2024-05-14 12:04:15.143353] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:48.452 crypto_ram2 00:27:48.452 [2024-05-14 12:04:15.151374] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:48.452 crypto_ram3 00:27:48.452 [2024-05-14 12:04:15.159394] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:48.452 crypto_ram4 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.452 12:04:15 blockdev_crypto_aesni 
-- common/autotest_common.sh@10 -- # set +x 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "006b08d4-1eab-52d9-9f64-04f71cda6d5a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "006b08d4-1eab-52d9-9f64-04f71cda6d5a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2114b533-ba17-5ebe-9225-b7b8111343e2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2114b533-ba17-5ebe-9225-b7b8111343e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6893430c-8a19-591e-ae73-6db36af8c077"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "6893430c-8a19-591e-ae73-6db36af8c077",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' 
"write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "164bf13c-3a27-56ad-887b-d2e54898a2b0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "164bf13c-3a27-56ad-887b-d2e54898a2b0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:27:48.452 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:27:48.452 12:04:15 blockdev_crypto_aesni -- 
bdev/blockdev.sh@754 -- # killprocess 1823229 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@946 -- # '[' -z 1823229 ']' 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@950 -- # kill -0 1823229 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@951 -- # uname 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1823229 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1823229' 00:27:48.452 killing process with pid 1823229 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@965 -- # kill 1823229 00:27:48.452 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@970 -- # wait 1823229 00:27:49.020 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:49.020 12:04:15 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:49.020 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:27:49.020 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:49.020 12:04:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:49.020 ************************************ 00:27:49.020 START TEST bdev_hello_world 00:27:49.020 ************************************ 00:27:49.020 12:04:16 blockdev_crypto_aesni.bdev_hello_world -- 
common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:49.020 [2024-05-14 12:04:16.093315] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:27:49.020 [2024-05-14 12:04:16.093371] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1823811 ] 00:27:49.278 [2024-05-14 12:04:16.222704] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:49.278 [2024-05-14 12:04:16.323946] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:49.278 [2024-05-14 12:04:16.345242] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:49.278 [2024-05-14 12:04:16.353262] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:49.278 [2024-05-14 12:04:16.361281] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:49.538 [2024-05-14 12:04:16.469501] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:52.074 [2024-05-14 12:04:18.699998] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:52.074 [2024-05-14 12:04:18.700073] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:52.074 [2024-05-14 12:04:18.700089] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:52.074 [2024-05-14 12:04:18.708018] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:52.074 [2024-05-14 12:04:18.708041] bdev.c:8109:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc1 00:27:52.074 [2024-05-14 12:04:18.708054] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:52.074 [2024-05-14 12:04:18.716038] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:52.074 [2024-05-14 12:04:18.716060] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:52.074 [2024-05-14 12:04:18.716072] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:52.074 [2024-05-14 12:04:18.724060] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:52.074 [2024-05-14 12:04:18.724079] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:52.074 [2024-05-14 12:04:18.724091] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:52.074 [2024-05-14 12:04:18.802412] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:27:52.074 [2024-05-14 12:04:18.802454] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:27:52.074 [2024-05-14 12:04:18.802473] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:27:52.074 [2024-05-14 12:04:18.803768] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:27:52.074 [2024-05-14 12:04:18.803850] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:27:52.074 [2024-05-14 12:04:18.803866] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:27:52.074 [2024-05-14 12:04:18.803911] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:27:52.074 00:27:52.074 [2024-05-14 12:04:18.803930] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:27:52.333 00:27:52.333 real 0m3.198s 00:27:52.333 user 0m2.778s 00:27:52.333 sys 0m0.382s 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:27:52.333 ************************************ 00:27:52.333 END TEST bdev_hello_world 00:27:52.333 ************************************ 00:27:52.333 12:04:19 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:27:52.333 12:04:19 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:27:52.333 12:04:19 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:52.333 12:04:19 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:52.333 ************************************ 00:27:52.333 START TEST bdev_bounds 00:27:52.333 ************************************ 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1824311 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1824311' 00:27:52.333 Process bdevio pid: 1824311 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1824311 00:27:52.333 12:04:19 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 1824311 ']' 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:52.333 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:52.333 12:04:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:52.333 [2024-05-14 12:04:19.388587] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:27:52.333 [2024-05-14 12:04:19.388655] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1824311 ] 00:27:52.592 [2024-05-14 12:04:19.520419] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:52.592 [2024-05-14 12:04:19.620685] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:52.592 [2024-05-14 12:04:19.620763] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:52.592 [2024-05-14 12:04:19.620768] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:52.592 [2024-05-14 12:04:19.642130] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:52.592 [2024-05-14 12:04:19.650160] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:52.592 [2024-05-14 12:04:19.658180] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:52.852 [2024-05-14 12:04:19.767328] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:55.393 [2024-05-14 12:04:21.993390] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:55.393 [2024-05-14 12:04:21.993478] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:55.393 [2024-05-14 12:04:21.993493] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.393 [2024-05-14 12:04:22.001417] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:55.393 [2024-05-14 12:04:22.001439] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:55.393 [2024-05-14 12:04:22.001450] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.393 [2024-05-14 12:04:22.009431] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:55.393 [2024-05-14 12:04:22.009452] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:55.393 [2024-05-14 12:04:22.009463] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.393 [2024-05-14 12:04:22.017452] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:55.393 [2024-05-14 12:04:22.017471] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:55.393 [2024-05-14 12:04:22.017482] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.393 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:55.393 12:04:22 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:27:55.393 12:04:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:55.393 I/O targets: 00:27:55.393 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:27:55.393 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:27:55.393 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:27:55.393 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:27:55.393 00:27:55.393 00:27:55.393 CUnit - A unit testing framework for C - Version 2.1-3 00:27:55.393 http://cunit.sourceforge.net/ 00:27:55.393 00:27:55.393 00:27:55.393 Suite: bdevio tests on: crypto_ram4 00:27:55.393 Test: blockdev write read block ...passed 00:27:55.393 Test: blockdev write zeroes read block ...passed 00:27:55.393 Test: blockdev write zeroes read no split ...passed 00:27:55.393 Test: blockdev write zeroes read split ...passed 00:27:55.393 Test: blockdev write zeroes read split partial ...passed 00:27:55.393 Test: blockdev reset ...passed 00:27:55.393 Test: blockdev write read 8 blocks ...passed 00:27:55.393 Test: blockdev write read size > 128k ...passed 00:27:55.393 Test: blockdev write read invalid size ...passed 00:27:55.393 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:55.393 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:55.393 Test: blockdev write read max offset ...passed 00:27:55.393 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:55.393 Test: blockdev writev readv 8 blocks ...passed 00:27:55.393 Test: blockdev writev readv 30 x 1block ...passed 00:27:55.393 Test: blockdev writev readv block ...passed 00:27:55.393 Test: blockdev writev readv size > 128k ...passed 00:27:55.393 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:55.393 Test: blockdev comparev and writev ...passed 00:27:55.393 Test: blockdev nvme 
passthru rw ...passed 00:27:55.393 Test: blockdev nvme passthru vendor specific ...passed 00:27:55.393 Test: blockdev nvme admin passthru ...passed 00:27:55.393 Test: blockdev copy ...passed 00:27:55.393 Suite: bdevio tests on: crypto_ram3 00:27:55.393 Test: blockdev write read block ...passed 00:27:55.393 Test: blockdev write zeroes read block ...passed 00:27:55.393 Test: blockdev write zeroes read no split ...passed 00:27:55.393 Test: blockdev write zeroes read split ...passed 00:27:55.393 Test: blockdev write zeroes read split partial ...passed 00:27:55.393 Test: blockdev reset ...passed 00:27:55.393 Test: blockdev write read 8 blocks ...passed 00:27:55.393 Test: blockdev write read size > 128k ...passed 00:27:55.393 Test: blockdev write read invalid size ...passed 00:27:55.393 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:55.393 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:55.393 Test: blockdev write read max offset ...passed 00:27:55.393 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:55.393 Test: blockdev writev readv 8 blocks ...passed 00:27:55.393 Test: blockdev writev readv 30 x 1block ...passed 00:27:55.393 Test: blockdev writev readv block ...passed 00:27:55.393 Test: blockdev writev readv size > 128k ...passed 00:27:55.393 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:55.393 Test: blockdev comparev and writev ...passed 00:27:55.393 Test: blockdev nvme passthru rw ...passed 00:27:55.393 Test: blockdev nvme passthru vendor specific ...passed 00:27:55.393 Test: blockdev nvme admin passthru ...passed 00:27:55.393 Test: blockdev copy ...passed 00:27:55.393 Suite: bdevio tests on: crypto_ram2 00:27:55.393 Test: blockdev write read block ...passed 00:27:55.393 Test: blockdev write zeroes read block ...passed 00:27:55.393 Test: blockdev write zeroes read no split ...passed 00:27:55.393 Test: blockdev write zeroes read split ...passed 
00:27:55.393 Test: blockdev write zeroes read split partial ...passed 00:27:55.393 Test: blockdev reset ...passed 00:27:55.393 Test: blockdev write read 8 blocks ...passed 00:27:55.393 Test: blockdev write read size > 128k ...passed 00:27:55.393 Test: blockdev write read invalid size ...passed 00:27:55.393 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:55.393 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:55.393 Test: blockdev write read max offset ...passed 00:27:55.393 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:55.393 Test: blockdev writev readv 8 blocks ...passed 00:27:55.393 Test: blockdev writev readv 30 x 1block ...passed 00:27:55.393 Test: blockdev writev readv block ...passed 00:27:55.393 Test: blockdev writev readv size > 128k ...passed 00:27:55.393 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:55.393 Test: blockdev comparev and writev ...passed 00:27:55.393 Test: blockdev nvme passthru rw ...passed 00:27:55.393 Test: blockdev nvme passthru vendor specific ...passed 00:27:55.393 Test: blockdev nvme admin passthru ...passed 00:27:55.393 Test: blockdev copy ...passed 00:27:55.393 Suite: bdevio tests on: crypto_ram 00:27:55.393 Test: blockdev write read block ...passed 00:27:55.393 Test: blockdev write zeroes read block ...passed 00:27:55.393 Test: blockdev write zeroes read no split ...passed 00:27:55.393 Test: blockdev write zeroes read split ...passed 00:27:55.393 Test: blockdev write zeroes read split partial ...passed 00:27:55.393 Test: blockdev reset ...passed 00:27:55.393 Test: blockdev write read 8 blocks ...passed 00:27:55.393 Test: blockdev write read size > 128k ...passed 00:27:55.393 Test: blockdev write read invalid size ...passed 00:27:55.393 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:55.393 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:55.393 Test: blockdev 
write read max offset ...passed 00:27:55.393 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:55.393 Test: blockdev writev readv 8 blocks ...passed 00:27:55.393 Test: blockdev writev readv 30 x 1block ...passed 00:27:55.393 Test: blockdev writev readv block ...passed 00:27:55.393 Test: blockdev writev readv size > 128k ...passed 00:27:55.393 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:55.393 Test: blockdev comparev and writev ...passed 00:27:55.393 Test: blockdev nvme passthru rw ...passed 00:27:55.393 Test: blockdev nvme passthru vendor specific ...passed 00:27:55.393 Test: blockdev nvme admin passthru ...passed 00:27:55.393 Test: blockdev copy ...passed 00:27:55.393 00:27:55.393 Run Summary: Type Total Ran Passed Failed Inactive 00:27:55.393 suites 4 4 n/a 0 0 00:27:55.393 tests 92 92 92 0 0 00:27:55.393 asserts 520 520 520 0 n/a 00:27:55.393 00:27:55.393 Elapsed time = 0.541 seconds 00:27:55.652 0 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1824311 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 1824311 ']' 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 1824311 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1824311 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with 
pid 1824311' 00:27:55.652 killing process with pid 1824311 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@965 -- # kill 1824311 00:27:55.652 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@970 -- # wait 1824311 00:27:55.912 12:04:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:27:55.912 00:27:55.912 real 0m3.653s 00:27:55.912 user 0m10.026s 00:27:55.912 sys 0m0.577s 00:27:55.912 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:55.912 12:04:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:55.912 ************************************ 00:27:55.912 END TEST bdev_bounds 00:27:55.912 ************************************ 00:27:56.171 12:04:23 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:27:56.171 12:04:23 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:27:56.171 12:04:23 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:56.171 12:04:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:56.171 ************************************ 00:27:56.171 START TEST bdev_nbd 00:27:56.171 ************************************ 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1824814 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1824814 /var/tmp/spdk-nbd.sock 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 1824814 ']' 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:27:56.171 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:56.171 12:04:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:56.171 [2024-05-14 12:04:23.130583] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:27:56.171 [2024-05-14 12:04:23.130651] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:56.430 [2024-05-14 12:04:23.261092] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:56.430 [2024-05-14 12:04:23.358298] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:56.430 [2024-05-14 12:04:23.379589] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:56.430 [2024-05-14 12:04:23.387610] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:56.430 [2024-05-14 12:04:23.395627] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:56.430 [2024-05-14 12:04:23.500936] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:58.995 [2024-05-14 12:04:25.729706] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:58.995 [2024-05-14 12:04:25.729770] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:58.995 [2024-05-14 12:04:25.729786] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:58.995 [2024-05-14 12:04:25.737726] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:58.995 [2024-05-14 12:04:25.737748] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:58.995 [2024-05-14 12:04:25.737760] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:58.995 [2024-05-14 12:04:25.745747] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:58.995 
[2024-05-14 12:04:25.745769] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:58.995 [2024-05-14 12:04:25.745780] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:58.995 [2024-05-14 12:04:25.753769] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:58.995 [2024-05-14 12:04:25.753797] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:58.995 [2024-05-14 12:04:25.753810] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:27:58.995 
12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:27:58.995 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:58.996 12:04:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:59.255 1+0 records in 00:27:59.255 1+0 records out 00:27:59.255 4096 
bytes (4.1 kB, 4.0 KiB) copied, 0.000292114 s, 14.0 MB/s 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:59.255 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@869 -- # break 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:59.514 1+0 records in 00:27:59.514 1+0 records out 00:27:59.514 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308029 s, 13.3 MB/s 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:59.514 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:59.773 1+0 records in 00:27:59.773 1+0 records out 00:27:59.773 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291087 s, 14.1 MB/s 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:59.773 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:59.773 12:04:26 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:00.033 1+0 records in 00:28:00.033 1+0 records out 00:28:00.033 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000327984 s, 12.5 MB/s 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:00.033 12:04:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:00.033 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:00.033 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:00.292 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:28:00.292 { 00:28:00.292 "nbd_device": "/dev/nbd0", 00:28:00.292 "bdev_name": "crypto_ram" 00:28:00.293 }, 00:28:00.293 { 00:28:00.293 "nbd_device": "/dev/nbd1", 00:28:00.293 "bdev_name": "crypto_ram2" 00:28:00.293 }, 00:28:00.293 { 00:28:00.293 "nbd_device": "/dev/nbd2", 00:28:00.293 "bdev_name": "crypto_ram3" 00:28:00.293 }, 00:28:00.293 { 00:28:00.293 "nbd_device": "/dev/nbd3", 00:28:00.293 "bdev_name": "crypto_ram4" 00:28:00.293 } 00:28:00.293 ]' 00:28:00.293 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:28:00.293 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:28:00.293 { 00:28:00.293 "nbd_device": "/dev/nbd0", 00:28:00.293 "bdev_name": "crypto_ram" 00:28:00.293 }, 00:28:00.293 { 00:28:00.293 "nbd_device": "/dev/nbd1", 00:28:00.293 "bdev_name": "crypto_ram2" 00:28:00.293 }, 00:28:00.293 { 00:28:00.293 "nbd_device": "/dev/nbd2", 00:28:00.293 "bdev_name": "crypto_ram3" 00:28:00.293 }, 00:28:00.293 { 00:28:00.293 "nbd_device": "/dev/nbd3", 00:28:00.293 "bdev_name": "crypto_ram4" 00:28:00.293 } 00:28:00.293 ]' 00:28:00.293 12:04:27 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:28:00.293 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:28:00.293 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:00.293 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:28:00.293 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:00.293 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:00.293 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:00.293 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:00.552 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:00.811 12:04:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:01.070 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:01.329 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:01.589 12:04:28 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 
'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:01.589 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:28:01.848 /dev/nbd0 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:01.848 1+0 records in 00:28:01.848 1+0 records out 00:28:01.848 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304967 s, 13.4 MB/s 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:01.848 12:04:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:28:02.107 /dev/nbd1 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:02.107 12:04:29 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:02.107 1+0 records in 00:28:02.107 1+0 records out 00:28:02.107 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253714 s, 16.1 MB/s 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:02.107 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:28:02.366 /dev/nbd10 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd10 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:02.366 1+0 records in 00:28:02.366 1+0 records out 00:28:02.366 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000351238 s, 11.7 MB/s 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:02.366 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:28:02.625 /dev/nbd11 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:02.625 1+0 records in 00:28:02.625 1+0 records out 00:28:02.625 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317768 s, 12.9 MB/s 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:02.625 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:28:02.883 { 00:28:02.883 "nbd_device": "/dev/nbd0", 00:28:02.883 "bdev_name": "crypto_ram" 00:28:02.883 }, 00:28:02.883 { 00:28:02.883 "nbd_device": "/dev/nbd1", 00:28:02.883 "bdev_name": "crypto_ram2" 00:28:02.883 }, 00:28:02.883 { 00:28:02.883 "nbd_device": "/dev/nbd10", 00:28:02.883 "bdev_name": "crypto_ram3" 00:28:02.883 }, 00:28:02.883 { 00:28:02.883 "nbd_device": "/dev/nbd11", 00:28:02.883 "bdev_name": "crypto_ram4" 00:28:02.883 } 00:28:02.883 ]' 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:28:02.883 { 00:28:02.883 "nbd_device": "/dev/nbd0", 00:28:02.883 "bdev_name": "crypto_ram" 00:28:02.883 }, 00:28:02.883 { 00:28:02.883 "nbd_device": "/dev/nbd1", 00:28:02.883 "bdev_name": "crypto_ram2" 00:28:02.883 }, 00:28:02.883 { 00:28:02.883 "nbd_device": "/dev/nbd10", 00:28:02.883 "bdev_name": "crypto_ram3" 00:28:02.883 }, 00:28:02.883 { 00:28:02.883 "nbd_device": "/dev/nbd11", 00:28:02.883 
"bdev_name": "crypto_ram4" 00:28:02.883 } 00:28:02.883 ]' 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:28:02.883 /dev/nbd1 00:28:02.883 /dev/nbd10 00:28:02.883 /dev/nbd11' 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:28:02.883 /dev/nbd1 00:28:02.883 /dev/nbd10 00:28:02.883 /dev/nbd11' 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:28:02.883 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:02.884 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:28:02.884 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:28:02.884 256+0 records in 00:28:02.884 
256+0 records out 00:28:02.884 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113909 s, 92.1 MB/s 00:28:02.884 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:02.884 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:28:02.884 256+0 records in 00:28:02.884 256+0 records out 00:28:02.884 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0615298 s, 17.0 MB/s 00:28:02.884 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:02.884 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:28:03.142 256+0 records in 00:28:03.142 256+0 records out 00:28:03.142 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0486614 s, 21.5 MB/s 00:28:03.142 12:04:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:03.142 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:28:03.142 256+0 records in 00:28:03.142 256+0 records out 00:28:03.142 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0452113 s, 23.2 MB/s 00:28:03.142 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:03.142 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:28:03.142 256+0 records in 00:28:03.142 256+0 records out 00:28:03.142 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0544503 s, 19.3 MB/s 00:28:03.142 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:28:03.142 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:03.142 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:03.142 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 
-- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:03.143 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:03.401 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:03.402 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:03.402 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:03.402 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:03.402 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:03.402 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:03.402 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:03.402 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:03.402 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:03.402 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:03.660 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:03.919 12:04:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:28:04.177 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:04.178 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:04.437 12:04:31 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:28:04.437 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:28:04.438 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:28:04.438 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:04.438 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:04.438 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:04.438 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:28:04.438 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:28:04.438 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:28:04.697 malloc_lvol_verify 00:28:04.697 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:28:04.956 8e22a618-f899-4094-bd2f-a3a63227c330 00:28:04.956 12:04:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:28:05.215 3634a46c-2b7e-421c-bcd3-259c7954da10 00:28:05.215 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:28:05.484 /dev/nbd0 00:28:05.484 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:28:05.484 mke2fs 1.46.5 (30-Dec-2021) 00:28:05.484 Discarding device blocks: 0/4096 done 00:28:05.484 Creating filesystem with 4096 1k blocks and 1024 inodes 00:28:05.484 00:28:05.484 Allocating group tables: 0/1 done 00:28:05.484 Writing inode tables: 0/1 done 00:28:05.484 Creating journal (1024 blocks): done 00:28:05.484 Writing superblocks and filesystem accounting information: 0/1 done 00:28:05.484 00:28:05.484 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:28:05.484 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:28:05.484 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:05.484 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:05.484 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:05.484 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:05.484 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:05.484 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:05.749 12:04:32 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1824814 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 1824814 ']' 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 1824814 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1824814 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1824814' 00:28:05.749 killing process with pid 1824814 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@965 -- # kill 1824814 00:28:05.749 12:04:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@970 -- # wait 1824814 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:28:06.318 00:28:06.318 real 0m10.130s 00:28:06.318 user 0m13.113s 00:28:06.318 sys 0m3.948s 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:06.318 ************************************ 00:28:06.318 END TEST bdev_nbd 00:28:06.318 ************************************ 00:28:06.318 12:04:33 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:28:06.318 12:04:33 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:28:06.318 12:04:33 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:28:06.318 12:04:33 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:28:06.318 12:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:28:06.318 12:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:06.318 12:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:06.318 ************************************ 00:28:06.318 START TEST bdev_fio 00:28:06.318 ************************************ 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:06.318 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:06.318 
12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 
00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:06.318 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:28:06.319 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:28:06.319 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 
--bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:28:06.319 12:04:33 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:06.319 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:28:06.319 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:06.319 12:04:33 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:06.578 ************************************ 00:28:06.578 START TEST bdev_fio_rw_verify 00:28:06.578 ************************************ 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:06.578 12:04:33 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:06.838 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:06.838 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:06.838 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:06.838 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:06.838 fio-3.35 00:28:06.838 Starting 4 threads 00:28:21.719 00:28:21.719 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1826739: Tue May 14 12:04:46 2024 00:28:21.719 read: IOPS=20.8k, BW=81.1MiB/s 
(85.0MB/s)(811MiB/10001msec) 00:28:21.719 slat (usec): min=11, max=516, avg=65.83, stdev=37.56 00:28:21.719 clat (usec): min=10, max=1649, avg=345.76, stdev=220.99 00:28:21.719 lat (usec): min=43, max=1841, avg=411.59, stdev=241.58 00:28:21.719 clat percentiles (usec): 00:28:21.719 | 50.000th=[ 297], 99.000th=[ 1074], 99.900th=[ 1352], 99.990th=[ 1483], 00:28:21.719 | 99.999th=[ 1565] 00:28:21.719 write: IOPS=22.7k, BW=88.8MiB/s (93.1MB/s)(868MiB/9769msec); 0 zone resets 00:28:21.719 slat (usec): min=14, max=1463, avg=78.55, stdev=38.25 00:28:21.719 clat (usec): min=24, max=1990, avg=419.55, stdev=263.80 00:28:21.719 lat (usec): min=51, max=2238, avg=498.11, stdev=284.71 00:28:21.719 clat percentiles (usec): 00:28:21.719 | 50.000th=[ 371], 99.000th=[ 1319], 99.900th=[ 1696], 99.990th=[ 1876], 00:28:21.719 | 99.999th=[ 1958] 00:28:21.719 bw ( KiB/s): min=72328, max=121608, per=98.10%, avg=89232.84, stdev=4498.22, samples=76 00:28:21.719 iops : min=18082, max=30402, avg=22308.21, stdev=1124.55, samples=76 00:28:21.719 lat (usec) : 20=0.01%, 50=0.40%, 100=6.47%, 250=27.33%, 500=40.73% 00:28:21.719 lat (usec) : 750=16.78%, 1000=5.51% 00:28:21.719 lat (msec) : 2=2.78% 00:28:21.719 cpu : usr=99.59%, sys=0.01%, ctx=136, majf=0, minf=273 00:28:21.719 IO depths : 1=10.1%, 2=25.6%, 4=51.2%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:21.719 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:21.719 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:21.719 issued rwts: total=207602,222139,0,0 short=0,0,0,0 dropped=0,0,0,0 00:28:21.719 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:21.719 00:28:21.719 Run status group 0 (all jobs): 00:28:21.719 READ: bw=81.1MiB/s (85.0MB/s), 81.1MiB/s-81.1MiB/s (85.0MB/s-85.0MB/s), io=811MiB (850MB), run=10001-10001msec 00:28:21.719 WRITE: bw=88.8MiB/s (93.1MB/s), 88.8MiB/s-88.8MiB/s (93.1MB/s-93.1MB/s), io=868MiB (910MB), run=9769-9769msec 00:28:21.719 00:28:21.719 real 
0m13.502s 00:28:21.719 user 0m45.592s 00:28:21.719 sys 0m0.508s 00:28:21.719 12:04:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:21.719 12:04:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:28:21.719 ************************************ 00:28:21.719 END TEST bdev_fio_rw_verify 00:28:21.719 ************************************ 00:28:21.719 12:04:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # touch 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:21.720 12:04:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "006b08d4-1eab-52d9-9f64-04f71cda6d5a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "006b08d4-1eab-52d9-9f64-04f71cda6d5a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2114b533-ba17-5ebe-9225-b7b8111343e2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"2114b533-ba17-5ebe-9225-b7b8111343e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6893430c-8a19-591e-ae73-6db36af8c077"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "6893430c-8a19-591e-ae73-6db36af8c077",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' 
"164bf13c-3a27-56ad-887b-d2e54898a2b0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "164bf13c-3a27-56ad-887b-d2e54898a2b0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:28:21.720 crypto_ram2 00:28:21.720 crypto_ram3 00:28:21.720 crypto_ram4 ]] 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "006b08d4-1eab-52d9-9f64-04f71cda6d5a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "006b08d4-1eab-52d9-9f64-04f71cda6d5a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2114b533-ba17-5ebe-9225-b7b8111343e2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2114b533-ba17-5ebe-9225-b7b8111343e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6893430c-8a19-591e-ae73-6db36af8c077"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "6893430c-8a19-591e-ae73-6db36af8c077",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": 
true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "164bf13c-3a27-56ad-887b-d2e54898a2b0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "164bf13c-3a27-56ad-887b-d2e54898a2b0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:28:21.720 12:04:47 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:28:21.720 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@10 -- # set +x 00:28:21.721 ************************************ 00:28:21.721 START TEST bdev_fio_trim 00:28:21.721 ************************************ 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:28:21.721 12:04:47 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:21.721 12:04:47 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:21.721 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:21.721 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:21.721 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:21.721 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:21.721 fio-3.35 00:28:21.721 Starting 4 threads 00:28:33.959 00:28:33.959 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1828634: Tue May 14 12:05:00 2024 00:28:33.959 write: IOPS=34.1k, BW=133MiB/s (140MB/s)(1333MiB/10001msec); 0 zone resets 00:28:33.959 slat (usec): min=16, max=1274, avg=63.19, stdev=30.94 00:28:33.959 clat (usec): min=45, max=1840, avg=306.21, stdev=176.25 00:28:33.959 lat (usec): min=71, max=2215, avg=369.40, stdev=195.15 00:28:33.959 clat percentiles (usec): 00:28:33.959 | 50.000th=[ 265], 99.000th=[ 848], 99.900th=[ 988], 99.990th=[ 1106], 00:28:33.959 | 99.999th=[ 1762] 00:28:33.959 bw ( KiB/s): min=126840, max=202253, per=100.00%, avg=136889.53, stdev=5701.87, samples=76 00:28:33.959 iops : min=31710, max=50563, avg=34222.37, stdev=1425.46, samples=76 00:28:33.959 trim: IOPS=34.1k, BW=133MiB/s (140MB/s)(1333MiB/10001msec); 0 zone resets 00:28:33.959 slat (usec): min=6, max=361, avg=18.59, stdev= 8.22 00:28:33.959 clat (usec): min=63, max=1598, avg=286.79, stdev=118.93 00:28:33.959 lat (usec): min=73, max=1611, avg=305.37, stdev=121.31 00:28:33.959 clat percentiles (usec): 00:28:33.959 | 50.000th=[ 273], 99.000th=[ 586], 99.900th=[ 685], 
99.990th=[ 783], 00:28:33.959 | 99.999th=[ 1303] 00:28:33.959 bw ( KiB/s): min=126840, max=202277, per=100.00%, avg=136890.79, stdev=5702.45, samples=76 00:28:33.959 iops : min=31710, max=50569, avg=34222.68, stdev=1425.60, samples=76 00:28:33.959 lat (usec) : 50=0.01%, 100=3.57%, 250=41.35%, 500=46.07%, 750=7.45% 00:28:33.959 lat (usec) : 1000=1.52% 00:28:33.959 lat (msec) : 2=0.04% 00:28:33.959 cpu : usr=99.61%, sys=0.00%, ctx=65, majf=0, minf=114 00:28:33.959 IO depths : 1=7.6%, 2=26.4%, 4=52.8%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:33.959 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:33.959 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:33.959 issued rwts: total=0,341174,341174,0 short=0,0,0,0 dropped=0,0,0,0 00:28:33.959 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:33.959 00:28:33.959 Run status group 0 (all jobs): 00:28:33.959 WRITE: bw=133MiB/s (140MB/s), 133MiB/s-133MiB/s (140MB/s-140MB/s), io=1333MiB (1397MB), run=10001-10001msec 00:28:33.959 TRIM: bw=133MiB/s (140MB/s), 133MiB/s-133MiB/s (140MB/s-140MB/s), io=1333MiB (1397MB), run=10001-10001msec 00:28:33.959 00:28:33.959 real 0m13.464s 00:28:33.959 user 0m45.372s 00:28:33.959 sys 0m0.484s 00:28:33.959 12:05:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:33.959 12:05:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:28:33.959 ************************************ 00:28:33.959 END TEST bdev_fio_trim 00:28:33.959 ************************************ 00:28:33.959 12:05:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:28:33.959 12:05:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:33.959 12:05:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:28:33.959 
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:33.959 12:05:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:28:33.959 00:28:33.959 real 0m27.352s 00:28:33.959 user 1m31.160s 00:28:33.959 sys 0m1.196s 00:28:33.959 12:05:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:33.959 12:05:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:33.959 ************************************ 00:28:33.959 END TEST bdev_fio 00:28:33.959 ************************************ 00:28:33.959 12:05:00 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:33.959 12:05:00 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:33.959 12:05:00 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:28:33.959 12:05:00 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:33.959 12:05:00 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:33.959 ************************************ 00:28:33.959 START TEST bdev_verify 00:28:33.959 ************************************ 00:28:33.959 12:05:00 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:33.959 [2024-05-14 12:05:00.790093] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:28:33.959 [2024-05-14 12:05:00.790155] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1830013 ] 00:28:33.959 [2024-05-14 12:05:00.918065] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:33.959 [2024-05-14 12:05:01.019469] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:33.959 [2024-05-14 12:05:01.019475] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:33.959 [2024-05-14 12:05:01.040813] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:34.218 [2024-05-14 12:05:01.048845] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:34.218 [2024-05-14 12:05:01.056867] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:34.218 [2024-05-14 12:05:01.161394] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:36.751 [2024-05-14 12:05:03.381021] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:36.751 [2024-05-14 12:05:03.381097] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:36.751 [2024-05-14 12:05:03.381112] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:36.751 [2024-05-14 12:05:03.389026] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:36.751 [2024-05-14 12:05:03.389047] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:36.751 [2024-05-14 12:05:03.389059] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:36.751 [2024-05-14 
12:05:03.397046] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:36.751 [2024-05-14 12:05:03.397065] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:36.751 [2024-05-14 12:05:03.397077] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:36.751 [2024-05-14 12:05:03.405069] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:36.751 [2024-05-14 12:05:03.405088] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:36.751 [2024-05-14 12:05:03.405100] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:36.751 Running I/O for 5 seconds... 00:28:42.032 00:28:42.032 Latency(us) 00:28:42.032 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:42.032 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:42.032 Verification LBA range: start 0x0 length 0x1000 00:28:42.032 crypto_ram : 5.06 480.31 1.88 0.00 0.00 265644.67 18464.06 169595.77 00:28:42.032 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:42.032 Verification LBA range: start 0x1000 length 0x1000 00:28:42.032 crypto_ram : 5.06 480.37 1.88 0.00 0.00 265441.44 18464.06 169595.77 00:28:42.032 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:42.032 Verification LBA range: start 0x0 length 0x1000 00:28:42.032 crypto_ram2 : 5.07 480.12 1.88 0.00 0.00 264837.25 9516.97 160477.72 00:28:42.032 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:42.032 Verification LBA range: start 0x1000 length 0x1000 00:28:42.032 crypto_ram2 : 5.06 480.23 1.88 0.00 0.00 264664.15 9573.95 160477.72 00:28:42.032 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:42.032 
Verification LBA range: start 0x0 length 0x1000 00:28:42.032 crypto_ram3 : 5.05 3772.59 14.74 0.00 0.00 33611.26 6069.20 28721.86 00:28:42.032 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:42.032 Verification LBA range: start 0x1000 length 0x1000 00:28:42.032 crypto_ram3 : 5.05 3777.06 14.75 0.00 0.00 33560.33 7921.31 28835.84 00:28:42.032 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:42.032 Verification LBA range: start 0x0 length 0x1000 00:28:42.032 crypto_ram4 : 5.06 3781.86 14.77 0.00 0.00 33433.94 1068.52 26214.40 00:28:42.032 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:42.032 Verification LBA range: start 0x1000 length 0x1000 00:28:42.032 crypto_ram4 : 5.06 3795.80 14.83 0.00 0.00 33317.02 2393.49 25872.47 00:28:42.032 =================================================================================================================== 00:28:42.032 Total : 17048.35 66.60 0.00 0.00 59626.60 1068.52 169595.77 00:28:42.032 00:28:42.032 real 0m8.259s 00:28:42.032 user 0m15.644s 00:28:42.032 sys 0m0.389s 00:28:42.032 12:05:08 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:42.032 12:05:08 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:28:42.032 ************************************ 00:28:42.032 END TEST bdev_verify 00:28:42.032 ************************************ 00:28:42.032 12:05:09 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:42.032 12:05:09 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:28:42.032 12:05:09 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:42.032 12:05:09 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:42.032 ************************************ 00:28:42.032 START TEST bdev_verify_big_io 00:28:42.032 ************************************ 00:28:42.032 12:05:09 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:42.291 [2024-05-14 12:05:09.136941] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:28:42.291 [2024-05-14 12:05:09.137007] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1831075 ] 00:28:42.291 [2024-05-14 12:05:09.266341] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:42.291 [2024-05-14 12:05:09.368755] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:42.291 [2024-05-14 12:05:09.368761] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:42.550 [2024-05-14 12:05:09.390125] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:42.550 [2024-05-14 12:05:09.398159] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:42.550 [2024-05-14 12:05:09.406179] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:42.550 [2024-05-14 12:05:09.513215] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:45.083 [2024-05-14 12:05:11.741342] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:45.083 [2024-05-14 12:05:11.741428] bdev.c:8109:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc0
00:28:45.083 [2024-05-14 12:05:11.741445] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:45.083 [2024-05-14 12:05:11.749361] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:28:45.083 [2024-05-14 12:05:11.749386] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:28:45.083 [2024-05-14 12:05:11.749405] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:45.083 [2024-05-14 12:05:11.757377] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:28:45.083 [2024-05-14 12:05:11.757395] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:28:45.083 [2024-05-14 12:05:11.757412] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:45.083 [2024-05-14 12:05:11.765405] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:28:45.083 [2024-05-14 12:05:11.765423] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:28:45.083 [2024-05-14 12:05:11.765434] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:45.083 Running I/O for 5 seconds...
00:28:47.618 [2024-05-14 12:05:14.421564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:47.618-00:28:47.621 (last message repeated continuously from [2024-05-14 12:05:14.421564] through [2024-05-14 12:05:14.670627])
00:28:47.621 [2024-05-14 12:05:14.670794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.670851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.670895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.670938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.671251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.672315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.672368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.672416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.672459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.672799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.672954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.673000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.673043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.621 [2024-05-14 12:05:14.673099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.673483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.674753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.674818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.674861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.674918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.675287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.675455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.675530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.675587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.675634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.675999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.676894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.621 [2024-05-14 12:05:14.676948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.676991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.677033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.677363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.677527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.677575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.621 [2024-05-14 12:05:14.677617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.677663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.678096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.679418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.679471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.679525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.679568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.622 [2024-05-14 12:05:14.680011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.680170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.680218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.680260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.680304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.680666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.681708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.681770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.681814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.681855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.682235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.682388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.682451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.622 [2024-05-14 12:05:14.682497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.682540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.682903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.683942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.684004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.684050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.684093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.684469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.684620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.684666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.684708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.684752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.685044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.622 [2024-05-14 12:05:14.686185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.686237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.686279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.686321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.686698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.686846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.686892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.686934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.686989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.687328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.688351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.688415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.688460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.622 [2024-05-14 12:05:14.688502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.688826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.688992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.689039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.689085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.689127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.689385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.690310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.690361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.690409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.690451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.690712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.690867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.622 [2024-05-14 12:05:14.690912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.690963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.691006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.691361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.692264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.692315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.692357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.692405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.692668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.692822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.692868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.692909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.692959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.622 [2024-05-14 12:05:14.693292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.694166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.694226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.694269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.694311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.694697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.694853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.694916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.694967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.695010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.622 [2024-05-14 12:05:14.695361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.850207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.851530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.883 [2024-05-14 12:05:14.852958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.854519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.857143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.858687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.860130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.860525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.862031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.863323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.864629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.866154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.868648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.870174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.871041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.883 [2024-05-14 12:05:14.871437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.873527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.875240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.876803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.878448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.881184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.882910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.883306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.883700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.885321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.886638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.888153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.889237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.883 [2024-05-14 12:05:14.891954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.892929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.893317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.893714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.895616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.897056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.898591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.899306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.902053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.902468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.902859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.904016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.905738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.883 [2024-05-14 12:05:14.907270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.908255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.909839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.911997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.912395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.912790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.914273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.916077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.917663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.918429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.919717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.921177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:47.883 [2024-05-14 12:05:14.921582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:47.883 [2024-05-14 12:05:14.922758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.053850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! [identical "Failed to get src_mbufs!" / "Failed to get dst_mbufs!" error messages repeated through 12:05:15.073485 — duplicate log lines elided]
00:28:48.147 [2024-05-14 12:05:15.073527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.073569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.074774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.076221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.076274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.076925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.077194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.077352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.078319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.078367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.079090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.080293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.080936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.147 [2024-05-14 12:05:15.080988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.082410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.082680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.082838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.083241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.083287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.083686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.147 [2024-05-14 12:05:15.084865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.086364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.086430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.088093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.088470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.088626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.089020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.089065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.089986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.091179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.092612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.092664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.093536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.093806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.093964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.094501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.094551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.095681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.096865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.097263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.097314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.098756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.099023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.099177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.100453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.100501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.100915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.102173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.103600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.103652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.104287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.104596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.104752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.106223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.106278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.107176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.108383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.108950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.108999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.110422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.110688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.110839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.111826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.111877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.113294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.114686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.116143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.116190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.116590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.116858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.117031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.118613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.118661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.119226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.120431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.121499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.121548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.122774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.123094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.123247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.124202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.124254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.125667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.126919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.127776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.127829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.129271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.129673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.129826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.130901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.130950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.131564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.136036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.137141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.137192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.137760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.138026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.138180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.138617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.138681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.140199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.141432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.142882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.142934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.144420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.144807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.144963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.146512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.146560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.147059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.148372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.149900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.149949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.150748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.151017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.151172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.152187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.152236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.152908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.156898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.158306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.158356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.159774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.160075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.160229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.161826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.161882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.163102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.166095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.167430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.167481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.168992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.169316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.169491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.170955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.171006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.172421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.176371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.176980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.177030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.178416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.178722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.178877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.180202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.180250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.181782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.185943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.186985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.187036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.187660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.187926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.188080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.188484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.188531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.190214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.193695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.195323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.195377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.197094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.197365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.197525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.198881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.198929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.199319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.202290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.203854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.203903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.204511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.148 [2024-05-14 12:05:15.204776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.148 [2024-05-14 12:05:15.204931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:48.411 [identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously from 12:05:15.204931 through 12:05:15.491168; repeats omitted]
00:28:48.411 [2024-05-14 12:05:15.491216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.411 [2024-05-14 12:05:15.491685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.411 [2024-05-14 12:05:15.492180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.411 [2024-05-14 12:05:15.492235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.411 [2024-05-14 12:05:15.492636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.411 [2024-05-14 12:05:15.492687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.495431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.495496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.496663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.496711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.497089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.498672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.498737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.673 [2024-05-14 12:05:15.499124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.499182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.502170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.502228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.502627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.502678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.503032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.504433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.504488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.504880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.504933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.507741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.507800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.673 [2024-05-14 12:05:15.509269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.509316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.509756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.510258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.510312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.510708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.510758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.513343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.513411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.514704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.514751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.515181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.516874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.673 [2024-05-14 12:05:15.516937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.517326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.517380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.520201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.520259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.521038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.521088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.521406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.522306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.522361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.523194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.523242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.526381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.673 [2024-05-14 12:05:15.527000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.528605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.528666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.673 [2024-05-14 12:05:15.528935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.529868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.530877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.531952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.532001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.536988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.537051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.537093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.537134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.537410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.674 [2024-05-14 12:05:15.539111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.539174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.539217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.539258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.543481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.543541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.543587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.543630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.543936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.544091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.544136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.544180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.544221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.674 [2024-05-14 12:05:15.546711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.546771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.546813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.546855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.547158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.547312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.547362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.547412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.547454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.551043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.551112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.551158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.551200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.674 [2024-05-14 12:05:15.551473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.551627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.551673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.551716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.551759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.555380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.555442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.555485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.555531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.555800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.555953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.556004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.556047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.674 [2024-05-14 12:05:15.556092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.558338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.558397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.558446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.558488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.558756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.558910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.558957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.558999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.559045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.563364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.563432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.563475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.674 [2024-05-14 12:05:15.563517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.563883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.564034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.564080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.564123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.564169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.567892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.567945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.567988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.568039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.568435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.568589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.568635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.674 [2024-05-14 12:05:15.568677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.568718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.572910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.572962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.573004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.573516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.573790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.573956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.574006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.574054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.575514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.578723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.579681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.674 [2024-05-14 12:05:15.579731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.580784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.581059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.581217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.582304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.582355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.583047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.584975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.674 [2024-05-14 12:05:15.586493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.586552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.587279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.587578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.587735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.675 [2024-05-14 12:05:15.589181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.589232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.590605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.594308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.595354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.595412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.597015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.597367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.597529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.598202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.598251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.599429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.675 [2024-05-14 12:05:15.603473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.675 [2024-05-14 12:05:15.604164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated for subsequent tasks between 12:05:15.604 and 12:05:15.903; intermediate duplicate entries omitted ...]
00:28:48.939 [2024-05-14 12:05:15.903378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:48.939 [2024-05-14 12:05:15.903676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.905415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.905477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.905993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.906039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.910448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.910515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.912102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.912160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.912506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.913549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.913607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.914966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.939 [2024-05-14 12:05:15.915013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.919839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.919897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.920825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.920898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.921172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.921741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.921798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.922979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.923023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.927288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.927347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.928184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.939 [2024-05-14 12:05:15.928234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.928574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.929841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.929897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.930365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.930416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.934861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.934920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.936575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.936621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.937009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.938490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.939 [2024-05-14 12:05:15.938546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.939 [2024-05-14 12:05:15.939249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.939299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.944286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.944345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.945260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.945306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.945587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.946297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.946353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.947440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.947490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.951872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.951931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.940 [2024-05-14 12:05:15.952322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.952366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.952644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.953872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.953927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.955460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.955507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.961192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.961248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.962277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.962324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.962716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.963619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.940 [2024-05-14 12:05:15.963676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.964901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.964949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.968873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.968931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.970578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.970626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.971030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.971536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.971599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.971987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.972035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.974440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.940 [2024-05-14 12:05:15.974498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.976140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.976185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.976562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.978034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.978088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.978486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.978543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.980872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.980945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.981334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.981384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.981799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.940 [2024-05-14 12:05:15.983568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.983629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.984120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.984168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.986583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.986644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.987033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.987082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.987463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.987961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.988021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.988433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.988483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.940 [2024-05-14 12:05:15.990955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.991019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.991422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.991472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.991835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.992328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.992383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.992784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.992839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.995475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.995534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.995925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.995974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.940 [2024-05-14 12:05:15.996337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.996844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.996906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.997296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:15.997347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.000239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.000304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.000702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.000761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.001176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.001687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.001750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.002139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.940 [2024-05-14 12:05:16.002194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.005179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.005238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.006295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.006345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.940 [2024-05-14 12:05:16.006622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.007725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.007781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.008396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.008452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.012807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.012871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.014287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:48.941 [2024-05-14 12:05:16.014336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.014752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.015245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.015301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.015895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.015941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.020606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.020666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.021056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.021105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.021378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.022487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:48.941 [2024-05-14 12:05:16.022543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.203 [2024-05-14 12:05:16.023814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.023864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.027321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.027380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.028236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.028285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.028619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.030247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.030309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.031459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.031508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.035584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.203 [2024-05-14 12:05:16.035666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.203 [2024-05-14 12:05:16.037188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.203 [... preceding *ERROR* line repeated identically (timestamps 12:05:16.037245 through 12:05:16.207387) ...]
00:28:49.207 [2024-05-14 12:05:16.207552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.209157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.209226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.210392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.211568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.212837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.212895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.213280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.213690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.213843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.215158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.215206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.216524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.207 [2024-05-14 12:05:16.217719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.219015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.219063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.220355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.220676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.220829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.221740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.221790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.222174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.223420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.224053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.224102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.225138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.207 [2024-05-14 12:05:16.225420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.225574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.225984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.226036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.226444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.227654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.227711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.227755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.229336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.229655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.229807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.229858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.229908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.207 [2024-05-14 12:05:16.230311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.231522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.232943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.233993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.234948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.235221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.235377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.236910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.237305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.239018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.243914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.244931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.245330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.207 [2024-05-14 12:05:16.245727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.246004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.247199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.207 [2024-05-14 12:05:16.248463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.249336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.250881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.255497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.256636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.257332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.259058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.259369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.260452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.261578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.208 [2024-05-14 12:05:16.262510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.263453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.265179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.266712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.268134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.268582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.268919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.269615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.270842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.272567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.273288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.274865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.275588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.208 [2024-05-14 12:05:16.276711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.278281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.278602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.279797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.281307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.281705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.282092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.284017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.285018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.286438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.208 [2024-05-14 12:05:16.286827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.287269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.288522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.471 [2024-05-14 12:05:16.289486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.290618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.291932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.293659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.294921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.295898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.296951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.297264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.298350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.299407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.299798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.300188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.302739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.471 [2024-05-14 12:05:16.303732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.304623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.305012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.305387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.307145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.308354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.308984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.310604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.312314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.314011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.314414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.314465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.314740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.471 [2024-05-14 12:05:16.315241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.315648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.316192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.316241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.318960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.319026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.320332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.320384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.320732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.321226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.321279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.322123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.322174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.471 [2024-05-14 12:05:16.324211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.324270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.325564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.325611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.325917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.326931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.326989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.327372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.327423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.329301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.329367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.330326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.330375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.471 [2024-05-14 12:05:16.330655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.331462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.331518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.331905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.331949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.334676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.334742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.335145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.335194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.335518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.336015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.336078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.336484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.471 [2024-05-14 12:05:16.336542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.338232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.338296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.338701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.338751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.471 [2024-05-14 12:05:16.339085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.472 [2024-05-14 12:05:16.339596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.472 [2024-05-14 12:05:16.339658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.472 [2024-05-14 12:05:16.340052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.472 [2024-05-14 12:05:16.340107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.472 [2024-05-14 12:05:16.342714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.472 [2024-05-14 12:05:16.342792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.472 [2024-05-14 12:05:16.343175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.472 [2024-05-14 12:05:16.343230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [... previous message repeated continuously through 2024-05-14 12:05:16.485416; duplicate lines omitted ...]
00:28:49.475 [2024-05-14 12:05:16.486959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.487009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.488143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.489596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.491004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.491053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.492507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.492782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.492929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.493791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.493841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.495152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.496314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.475 [2024-05-14 12:05:16.496725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.496774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.497160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.497438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.497595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.499135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.499189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.500718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.501916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.503254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.503304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.504838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.505155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.475 [2024-05-14 12:05:16.505306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.505709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.505755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.506504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.507660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.509199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.509249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.510495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.510809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.510959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.512292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.512342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.513878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.475 [2024-05-14 12:05:16.515414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.516722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.516769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.518088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.518364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.518523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.519607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.519656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.521368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.522555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.523080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.523126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.523519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.475 [2024-05-14 12:05:16.523820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.523971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.525269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.525317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.475 [2024-05-14 12:05:16.526625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.527760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.529078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.529129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.530452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.530730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.530882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.531277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.531322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.476 [2024-05-14 12:05:16.531718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.532872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.534438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.534489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.535332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.535618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.535770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.537447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.537508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.539136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.540392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.541532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.541583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.476 [2024-05-14 12:05:16.542852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.543168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.543317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.544884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.544933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.545540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.546703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.548414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.548463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.548849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.549268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.549428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.550892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.476 [2024-05-14 12:05:16.550941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.552427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.476 [2024-05-14 12:05:16.553670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.555099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.555150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.556527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.556802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.556953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.558281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.558329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.558737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.560165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.561003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.738 [2024-05-14 12:05:16.561057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.562617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.562943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.563095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.563502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.563555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.563938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.565090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.566009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.566061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.567028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.567301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.567467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.738 [2024-05-14 12:05:16.567862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.567907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.568294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.569523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.571232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.571278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.572673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.573025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.573182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.573582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.573628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.574650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.738 [2024-05-14 12:05:16.575966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.739 [2024-05-14 12:05:16.576937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.576989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.578194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.578583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.578744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.579134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.579179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.580771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.581973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.583508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.583568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.585155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.585434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.739 [2024-05-14 12:05:16.585593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.585984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.586031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.586423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.587581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.588525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.588577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.589747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.590047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.590208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.590609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.590656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.591044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.739 [2024-05-14 12:05:16.592295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.593983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.594039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.595627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.596029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.596182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.596585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.596632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.597645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.598840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.598897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.598939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.739 [2024-05-14 12:05:16.600143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.739 [2024-05-14 12:05:16.600471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.742 [previous message repeated many times, 2024-05-14 12:05:16.600627 through 12:05:16.801886]
00:28:49.742 [2024-05-14 12:05:16.803408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.803454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.806350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.806418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.806939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.806987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.807269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.807784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.807840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.808228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.808276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.810759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.811293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.742 [2024-05-14 12:05:16.811691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.811741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.812062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.813376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.815001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.815397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.815451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.818321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.818385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.818461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.818505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.818858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.820311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:49.742 [2024-05-14 12:05:16.820366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.820414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.820461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.821819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.821871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.821913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:49.742 [2024-05-14 12:05:16.821955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.822288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.822453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.822500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.822542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.822591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.823797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.037 [2024-05-14 12:05:16.823850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.823893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.823934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.824270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.824431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.824479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.824521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.824569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.825772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.825824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.825867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.825909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.826345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.037 [2024-05-14 12:05:16.826506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.826558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.826600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.826641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.827780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.037 [2024-05-14 12:05:16.827832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.827883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.827930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.828203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.828357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.828414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.828461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.828506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.038 [2024-05-14 12:05:16.829640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.829698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.829747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.829790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.830059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.830210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.830269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.830313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.830357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.831717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.831773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.831815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.831857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.038 [2024-05-14 12:05:16.832200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.832355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.832407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.832451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.832493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.833678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.833735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.833776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.833818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.834119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.834270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.834321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.834363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.038 [2024-05-14 12:05:16.834410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.835948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.836001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.836044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.836863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.837189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.837343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.837389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.837438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.838749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.839920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.841379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.841432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.038 [2024-05-14 12:05:16.842915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.843243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.843407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.843805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.843856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.844245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.845405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.846708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.846757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.847860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.848134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.848290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.849916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.038 [2024-05-14 12:05:16.849971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.851607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.852901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.853736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.853785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.855075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.855418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.855574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.856895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.856945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.857873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.859073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.860392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.038 [2024-05-14 12:05:16.860445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.860885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.861193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.861346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.862343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.862392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.863688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.864860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.866357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.866413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.867911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.868253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.868412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.038 [2024-05-14 12:05:16.869801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.869847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.870233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.871453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.872755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.872803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.874093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.874370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.874532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.038 [2024-05-14 12:05:16.876151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.876198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.877765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.878992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.039 [2024-05-14 12:05:16.879392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.879449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.880463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.880780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.880931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.882248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.882296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.883605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.884822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.886154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.886203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.887565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.039 [2024-05-14 12:05:16.887898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.039 [2024-05-14 12:05:16.888052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:50.042 last message repeated for each allocation attempt through [2024-05-14 12:05:17.071394]
00:28:50.042 [2024-05-14 12:05:17.071447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.073165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.073224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.074309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.074361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.074638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.075130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.075188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.075588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.075638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.078011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.078070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.079023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.042 [2024-05-14 12:05:17.079073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.079409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.079905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.079966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.080370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.080423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.083192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.083261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.084822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.084876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.085249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.085751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.085808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.042 [2024-05-14 12:05:17.086786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.086834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.089023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.089082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.090072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.090135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.090586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.042 [2024-05-14 12:05:17.091088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.091141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.092498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.092546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.095510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.095573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.334 [2024-05-14 12:05:17.096595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.096643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.096921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.097426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.097484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.097877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.097927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.099842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.099902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.101259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.101307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.101680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.102831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.334 [2024-05-14 12:05:17.102897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.103290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.103346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.105464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.105526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.106986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.107050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.334 [2024-05-14 12:05:17.107324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.107827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.107883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.108272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.108320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.110418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.335 [2024-05-14 12:05:17.110478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.111440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.111490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.111768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.112263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.112324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.112729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.112779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.114635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.114695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.115947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.115999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.116272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.335 [2024-05-14 12:05:17.117949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.118004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.118395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.118450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.120943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.121002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.122209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.122257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.122569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.123359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.123422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.123814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.123863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.335 [2024-05-14 12:05:17.126784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.126870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.127259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.127309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.127662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.128300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.128354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.129651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.129700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.132196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.132254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.133560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.133608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.335 [2024-05-14 12:05:17.133928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.134431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.134497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.134887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.134936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.136927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.136987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.138358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.138414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.138725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.140388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.140456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.140989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.335 [2024-05-14 12:05:17.141039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.142599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.142659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.143048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.143097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.143490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.143988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.144045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.144446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.144495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.146189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.146251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.147796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.335 [2024-05-14 12:05:17.147847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.148119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.148629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.148686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.149074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.149122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.151872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.151931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.152866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.152912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.153188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.154807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.154874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.335 [2024-05-14 12:05:17.156520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.156585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.158667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.159953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.161253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.161302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.161581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.162732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.164287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.165659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.165708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.335 [2024-05-14 12:05:17.167407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.167465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.336 [2024-05-14 12:05:17.167507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.167550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.167825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.169210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.169267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.169309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.169350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.170688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.170740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.170781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.170822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.171137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.336 [2024-05-14 12:05:17.171291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:28:50.336 [2024-05-14 12:05:17.171337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:28:50.338 [2024-05-14 12:05:17.287324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.338 [2024-05-14 12:05:17.287722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.338 [2024-05-14 12:05:17.288978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.338 [2024-05-14 12:05:17.289059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:28:50.597 00:28:50.597 Latency(us) 00:28:50.597 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:50.597 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:50.597 Verification LBA range: start 0x0 length 0x100 00:28:50.597 crypto_ram : 5.79 44.25 2.77 0.00 0.00 2810361.77 70664.90 2640587.91 00:28:50.597 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:50.597 Verification LBA range: start 0x100 length 0x100 00:28:50.597 crypto_ram : 5.77 44.37 2.77 0.00 0.00 2796237.69 76135.74 2553054.61 00:28:50.597 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:50.597 Verification LBA range: start 0x0 length 0x100 00:28:50.597 crypto_ram2 : 5.79 44.24 2.77 0.00 0.00 2713786.99 70209.00 2640587.91 00:28:50.598 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:50.598 Verification LBA range: start 0x100 length 0x100 00:28:50.598 crypto_ram2 : 5.77 44.36 2.77 0.00 0.00 2701089.39 75679.83 2553054.61 00:28:50.598 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:50.598 Verification LBA range: start 0x0 length 0x100 00:28:50.598 crypto_ram3 : 5.55 281.46 17.59 0.00 0.00 408176.58 31457.28 594497.00 00:28:50.598 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:50.598 
Verification LBA range: start 0x100 length 0x100 00:28:50.598 crypto_ram3 : 5.56 292.30 18.27 0.00 0.00 393189.64 49009.53 590849.78 00:28:50.598 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:50.598 Verification LBA range: start 0x0 length 0x100 00:28:50.598 crypto_ram4 : 5.65 297.97 18.62 0.00 0.00 374502.36 15728.64 521552.58 00:28:50.598 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:50.598 Verification LBA range: start 0x100 length 0x100 00:28:50.598 crypto_ram4 : 5.66 309.00 19.31 0.00 0.00 361504.17 13506.11 514258.14 00:28:50.598 =================================================================================================================== 00:28:50.598 Total : 1357.95 84.87 0.00 0.00 701372.86 13506.11 2640587.91 00:28:51.166 00:28:51.166 real 0m8.994s 00:28:51.166 user 0m17.039s 00:28:51.166 sys 0m0.454s 00:28:51.166 12:05:18 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:51.166 12:05:18 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:28:51.166 ************************************ 00:28:51.166 END TEST bdev_verify_big_io 00:28:51.166 ************************************ 00:28:51.167 12:05:18 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:51.167 12:05:18 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:28:51.167 12:05:18 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:51.167 12:05:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:51.167 ************************************ 00:28:51.167 START TEST bdev_write_zeroes 00:28:51.167 ************************************ 00:28:51.167 
00:28:51.167 12:05:18 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:28:51.426 [2024-05-14 12:05:18.208349] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:28:51.426 [2024-05-14 12:05:18.208413] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1832311 ]
00:28:51.426 [2024-05-14 12:05:18.336291] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:28:51.426 [2024-05-14 12:05:18.437742] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:28:51.426 [2024-05-14 12:05:18.459037] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:28:51.426 [2024-05-14 12:05:18.467044] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:28:51.426 [2024-05-14 12:05:18.475062] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:28:51.685 [2024-05-14 12:05:18.588170] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:28:54.219 [2024-05-14 12:05:20.832614] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:28:54.219 [2024-05-14 12:05:20.832679] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:28:54.219 [2024-05-14 12:05:20.832694] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:54.219 [2024-05-14 12:05:20.840635] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:28:54.219 [2024-05-14 12:05:20.840654] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:28:54.219 [2024-05-14 12:05:20.840666] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:54.219 [2024-05-14 12:05:20.848653] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:28:54.219 [2024-05-14 12:05:20.848670] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:28:54.219 [2024-05-14 12:05:20.848682] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:54.219 [2024-05-14 12:05:20.856674] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:28:54.219 [2024-05-14 12:05:20.856691] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:28:54.219 [2024-05-14 12:05:20.856703] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:54.219 Running I/O for 1 seconds...
00:28:55.156 00:28:55.156 Latency(us) 00:28:55.156 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:55.156 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:55.156 crypto_ram : 1.03 1973.54 7.71 0.00 0.00 64362.02 5470.83 77503.44 00:28:55.156 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:55.156 crypto_ram2 : 1.03 1979.26 7.73 0.00 0.00 63818.18 5413.84 72032.61 00:28:55.156 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:55.156 crypto_ram3 : 1.02 15121.29 59.07 0.00 0.00 8329.09 2478.97 10770.70 00:28:55.156 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:55.156 crypto_ram4 : 1.02 15158.47 59.21 0.00 0.00 8282.35 2478.97 8776.13 00:28:55.156 =================================================================================================================== 00:28:55.156 Total : 34232.55 133.72 0.00 0.00 14775.91 2478.97 77503.44 00:28:55.418 00:28:55.418 real 0m4.220s 00:28:55.418 user 0m3.776s 00:28:55.418 sys 0m0.400s 00:28:55.418 12:05:22 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:55.418 12:05:22 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:28:55.418 ************************************ 00:28:55.418 END TEST bdev_write_zeroes 00:28:55.418 ************************************ 00:28:55.418 12:05:22 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:55.418 12:05:22 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:28:55.418 12:05:22 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:55.418 
12:05:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:55.418 ************************************ 00:28:55.418 START TEST bdev_json_nonenclosed 00:28:55.418 ************************************ 00:28:55.418 12:05:22 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:55.678 [2024-05-14 12:05:22.530184] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:28:55.678 [2024-05-14 12:05:22.530246] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1832859 ] 00:28:55.678 [2024-05-14 12:05:22.657033] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:55.678 [2024-05-14 12:05:22.757531] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:55.678 [2024-05-14 12:05:22.757600] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:28:55.678 [2024-05-14 12:05:22.757620] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:55.678 [2024-05-14 12:05:22.757633] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:55.936 00:28:55.936 real 0m0.392s 00:28:55.936 user 0m0.233s 00:28:55.936 sys 0m0.156s 00:28:55.936 12:05:22 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:55.936 12:05:22 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:28:55.936 ************************************ 00:28:55.936 END TEST bdev_json_nonenclosed 00:28:55.936 ************************************ 00:28:55.936 12:05:22 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:55.936 12:05:22 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:28:55.936 12:05:22 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:55.936 12:05:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:55.936 ************************************ 00:28:55.936 START TEST bdev_json_nonarray 00:28:55.936 ************************************ 00:28:55.936 12:05:22 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:55.936 [2024-05-14 12:05:23.005154] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:28:55.936 [2024-05-14 12:05:23.005212] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1832880 ] 00:28:56.194 [2024-05-14 12:05:23.135020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:56.194 [2024-05-14 12:05:23.242260] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:56.194 [2024-05-14 12:05:23.242330] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:28:56.194 [2024-05-14 12:05:23.242351] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:56.194 [2024-05-14 12:05:23.242364] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:56.452 00:28:56.452 real 0m0.395s 00:28:56.452 user 0m0.238s 00:28:56.452 sys 0m0.154s 00:28:56.452 12:05:23 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:56.452 12:05:23 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:28:56.452 ************************************ 00:28:56.452 END TEST bdev_json_nonarray 00:28:56.452 ************************************ 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:28:56.452 12:05:23 
blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:28:56.452 12:05:23 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:28:56.452 00:28:56.452 real 1m11.851s 00:28:56.452 user 2m38.676s 00:28:56.452 sys 0m8.941s 00:28:56.452 12:05:23 blockdev_crypto_aesni -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:56.452 12:05:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:56.452 ************************************ 00:28:56.452 END TEST blockdev_crypto_aesni 00:28:56.452 ************************************ 00:28:56.452 12:05:23 -- spdk/autotest.sh@354 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:28:56.452 12:05:23 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:28:56.452 12:05:23 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:56.452 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:28:56.452 ************************************ 00:28:56.452 START TEST blockdev_crypto_sw 00:28:56.452 ************************************ 00:28:56.452 12:05:23 blockdev_crypto_sw -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:28:56.711 * Looking for test storage... 
00:28:56.711 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:28:56.711 
12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1833114 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:28:56.711 12:05:23 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1833114 00:28:56.711 12:05:23 blockdev_crypto_sw -- common/autotest_common.sh@827 -- # '[' -z 1833114 ']' 00:28:56.711 12:05:23 blockdev_crypto_sw -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:56.711 12:05:23 blockdev_crypto_sw -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:56.711 12:05:23 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:56.711 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:56.711 12:05:23 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:56.711 12:05:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:56.711 [2024-05-14 12:05:23.666488] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:28:56.712 [2024-05-14 12:05:23.666559] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1833114 ] 00:28:56.712 [2024-05-14 12:05:23.789989] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:56.970 [2024-05-14 12:05:23.893426] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:57.538 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:57.538 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@860 -- # return 0 00:28:57.538 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:28:57.538 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:28:57.538 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:28:57.538 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:57.538 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:57.797 Malloc0 00:28:57.797 Malloc1 00:28:57.797 true 00:28:57.797 true 00:28:57.797 true 00:28:57.797 [2024-05-14 12:05:24.868884] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:57.797 crypto_ram 00:28:57.797 [2024-05-14 12:05:24.876912] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:57.797 crypto_ram2 00:28:58.056 [2024-05-14 12:05:24.884934] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:58.056 crypto_ram3 00:28:58.056 [ 00:28:58.056 { 00:28:58.056 "name": "Malloc1", 00:28:58.056 "aliases": [ 00:28:58.056 "8ba80f68-75d1-47d2-aaa3-94e346a90f60" 00:28:58.056 ], 00:28:58.056 "product_name": "Malloc disk", 00:28:58.056 "block_size": 4096, 00:28:58.056 "num_blocks": 4096, 00:28:58.056 "uuid": "8ba80f68-75d1-47d2-aaa3-94e346a90f60", 00:28:58.056 
"assigned_rate_limits": { 00:28:58.056 "rw_ios_per_sec": 0, 00:28:58.056 "rw_mbytes_per_sec": 0, 00:28:58.056 "r_mbytes_per_sec": 0, 00:28:58.056 "w_mbytes_per_sec": 0 00:28:58.056 }, 00:28:58.056 "claimed": true, 00:28:58.056 "claim_type": "exclusive_write", 00:28:58.056 "zoned": false, 00:28:58.056 "supported_io_types": { 00:28:58.056 "read": true, 00:28:58.056 "write": true, 00:28:58.056 "unmap": true, 00:28:58.056 "write_zeroes": true, 00:28:58.056 "flush": true, 00:28:58.056 "reset": true, 00:28:58.056 "compare": false, 00:28:58.056 "compare_and_write": false, 00:28:58.056 "abort": true, 00:28:58.056 "nvme_admin": false, 00:28:58.056 "nvme_io": false 00:28:58.056 }, 00:28:58.056 "memory_domains": [ 00:28:58.056 { 00:28:58.056 "dma_device_id": "system", 00:28:58.056 "dma_device_type": 1 00:28:58.056 }, 00:28:58.056 { 00:28:58.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:58.056 "dma_device_type": 2 00:28:58.056 } 00:28:58.056 ], 00:28:58.056 "driver_specific": {} 00:28:58.056 } 00:28:58.056 ] 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:58.056 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:58.056 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:28:58.056 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:58.056 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 
-- # rpc_cmd save_subsystem_config -n bdev 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:58.056 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:58.056 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:28:58.056 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:58.056 12:05:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:58.056 12:05:24 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:28:58.056 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:58.056 12:05:25 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:28:58.056 12:05:25 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ff2f0007-58cd-5e70-be54-5140b34c1ba2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ff2f0007-58cd-5e70-be54-5140b34c1ba2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "8b21b50e-3315-5644-b6fa-3fc3f6fb2de9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "8b21b50e-3315-5644-b6fa-3fc3f6fb2de9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:28:58.056 12:05:25 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:28:58.056 12:05:25 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:28:58.056 12:05:25 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:28:58.056 12:05:25 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM 
EXIT 00:28:58.056 12:05:25 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 1833114 00:28:58.056 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@946 -- # '[' -z 1833114 ']' 00:28:58.056 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@950 -- # kill -0 1833114 00:28:58.056 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@951 -- # uname 00:28:58.056 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:58.056 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1833114 00:28:58.315 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:28:58.315 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:28:58.315 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1833114' 00:28:58.315 killing process with pid 1833114 00:28:58.315 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@965 -- # kill 1833114 00:28:58.315 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@970 -- # wait 1833114 00:28:58.575 12:05:25 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:58.575 12:05:25 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:58.575 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:28:58.575 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:58.575 12:05:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:58.575 ************************************ 00:28:58.575 START TEST bdev_hello_world 00:28:58.575 ************************************ 00:28:58.575 12:05:25 blockdev_crypto_sw.bdev_hello_world -- 
common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:58.575 [2024-05-14 12:05:25.651749] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:28:58.575 [2024-05-14 12:05:25.651796] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1833319 ] 00:28:58.835 [2024-05-14 12:05:25.763047] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:58.835 [2024-05-14 12:05:25.861903] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:59.094 [2024-05-14 12:05:26.024315] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:59.094 [2024-05-14 12:05:26.024380] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:59.094 [2024-05-14 12:05:26.024396] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:59.094 [2024-05-14 12:05:26.032331] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:59.094 [2024-05-14 12:05:26.032352] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:59.094 [2024-05-14 12:05:26.032364] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:59.094 [2024-05-14 12:05:26.040353] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:59.094 [2024-05-14 12:05:26.040372] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:59.094 [2024-05-14 12:05:26.040384] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:28:59.094 [2024-05-14 12:05:26.080817] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:28:59.094 [2024-05-14 12:05:26.080856] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:28:59.094 [2024-05-14 12:05:26.080875] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:28:59.094 [2024-05-14 12:05:26.082856] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:28:59.094 [2024-05-14 12:05:26.082935] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:28:59.094 [2024-05-14 12:05:26.082956] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:28:59.094 [2024-05-14 12:05:26.082990] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:28:59.094 00:28:59.094 [2024-05-14 12:05:26.083008] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:28:59.353 00:28:59.353 real 0m0.690s 00:28:59.353 user 0m0.467s 00:28:59.353 sys 0m0.208s 00:28:59.353 12:05:26 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:59.353 12:05:26 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:28:59.353 ************************************ 00:28:59.353 END TEST bdev_hello_world 00:28:59.353 ************************************ 00:28:59.353 12:05:26 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:28:59.354 12:05:26 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:28:59.354 12:05:26 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:59.354 12:05:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:59.354 ************************************ 00:28:59.354 START TEST bdev_bounds 00:28:59.354 ************************************ 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:28:59.354 12:05:26 
blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1833498 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1833498' 00:28:59.354 Process bdevio pid: 1833498 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1833498 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 1833498 ']' 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:59.354 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:59.354 12:05:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:59.354 [2024-05-14 12:05:26.433914] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:28:59.354 [2024-05-14 12:05:26.433974] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1833498 ] 00:28:59.613 [2024-05-14 12:05:26.560983] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:59.613 [2024-05-14 12:05:26.666752] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:59.613 [2024-05-14 12:05:26.666836] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:28:59.613 [2024-05-14 12:05:26.666840] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:59.872 [2024-05-14 12:05:26.851113] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:59.872 [2024-05-14 12:05:26.851174] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:59.872 [2024-05-14 12:05:26.851188] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:59.872 [2024-05-14 12:05:26.859136] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:59.872 [2024-05-14 12:05:26.859154] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:59.872 [2024-05-14 12:05:26.859166] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:59.872 [2024-05-14 12:05:26.867157] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:59.872 [2024-05-14 12:05:26.867179] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:59.872 [2024-05-14 12:05:26.867190] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:00.439 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 
00:29:00.439 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:29:00.439 12:05:27 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:00.439 I/O targets: 00:29:00.439 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:29:00.439 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:29:00.439 00:29:00.439 00:29:00.439 CUnit - A unit testing framework for C - Version 2.1-3 00:29:00.439 http://cunit.sourceforge.net/ 00:29:00.439 00:29:00.439 00:29:00.439 Suite: bdevio tests on: crypto_ram3 00:29:00.439 Test: blockdev write read block ...passed 00:29:00.439 Test: blockdev write zeroes read block ...passed 00:29:00.439 Test: blockdev write zeroes read no split ...passed 00:29:00.439 Test: blockdev write zeroes read split ...passed 00:29:00.439 Test: blockdev write zeroes read split partial ...passed 00:29:00.439 Test: blockdev reset ...passed 00:29:00.439 Test: blockdev write read 8 blocks ...passed 00:29:00.439 Test: blockdev write read size > 128k ...passed 00:29:00.439 Test: blockdev write read invalid size ...passed 00:29:00.439 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:00.439 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:00.439 Test: blockdev write read max offset ...passed 00:29:00.439 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:00.439 Test: blockdev writev readv 8 blocks ...passed 00:29:00.439 Test: blockdev writev readv 30 x 1block ...passed 00:29:00.439 Test: blockdev writev readv block ...passed 00:29:00.439 Test: blockdev writev readv size > 128k ...passed 00:29:00.439 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:00.440 Test: blockdev comparev and writev ...passed 00:29:00.440 Test: blockdev nvme passthru rw ...passed 00:29:00.440 Test: blockdev nvme passthru vendor specific ...passed 00:29:00.440 
Test: blockdev nvme admin passthru ...passed 00:29:00.440 Test: blockdev copy ...passed 00:29:00.440 Suite: bdevio tests on: crypto_ram 00:29:00.440 Test: blockdev write read block ...passed 00:29:00.440 Test: blockdev write zeroes read block ...passed 00:29:00.440 Test: blockdev write zeroes read no split ...passed 00:29:00.440 Test: blockdev write zeroes read split ...passed 00:29:00.440 Test: blockdev write zeroes read split partial ...passed 00:29:00.440 Test: blockdev reset ...passed 00:29:00.440 Test: blockdev write read 8 blocks ...passed 00:29:00.440 Test: blockdev write read size > 128k ...passed 00:29:00.440 Test: blockdev write read invalid size ...passed 00:29:00.440 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:00.440 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:00.440 Test: blockdev write read max offset ...passed 00:29:00.440 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:00.440 Test: blockdev writev readv 8 blocks ...passed 00:29:00.440 Test: blockdev writev readv 30 x 1block ...passed 00:29:00.440 Test: blockdev writev readv block ...passed 00:29:00.440 Test: blockdev writev readv size > 128k ...passed 00:29:00.440 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:00.440 Test: blockdev comparev and writev ...passed 00:29:00.440 Test: blockdev nvme passthru rw ...passed 00:29:00.440 Test: blockdev nvme passthru vendor specific ...passed 00:29:00.440 Test: blockdev nvme admin passthru ...passed 00:29:00.440 Test: blockdev copy ...passed 00:29:00.440 00:29:00.440 Run Summary: Type Total Ran Passed Failed Inactive 00:29:00.440 suites 2 2 n/a 0 0 00:29:00.440 tests 46 46 46 0 0 00:29:00.440 asserts 260 260 260 0 n/a 00:29:00.440 00:29:00.440 Elapsed time = 0.087 seconds 00:29:00.440 0 00:29:00.440 12:05:27 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1833498 00:29:00.440 12:05:27 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@946 -- # '[' -z 1833498 ']' 00:29:00.440 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 1833498 00:29:00.440 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:29:00.699 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:29:00.699 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1833498 00:29:00.699 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:29:00.699 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:29:00.699 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1833498' 00:29:00.699 killing process with pid 1833498 00:29:00.699 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@965 -- # kill 1833498 00:29:00.699 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@970 -- # wait 1833498 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:29:00.958 00:29:00.958 real 0m1.420s 00:29:00.958 user 0m3.676s 00:29:00.958 sys 0m0.381s 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:00.958 ************************************ 00:29:00.958 END TEST bdev_bounds 00:29:00.958 ************************************ 00:29:00.958 12:05:27 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:29:00.958 12:05:27 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:29:00.958 12:05:27 blockdev_crypto_sw -- 
common/autotest_common.sh@1103 -- # xtrace_disable 00:29:00.958 12:05:27 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:00.958 ************************************ 00:29:00.958 START TEST bdev_nbd 00:29:00.958 ************************************ 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:29:00.958 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # local nbd_list 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1833712 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1833712 /var/tmp/spdk-nbd.sock 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 1833712 ']' 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:29:00.959 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:29:00.959 12:05:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:00.959 [2024-05-14 12:05:27.920793] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:29:00.959 [2024-05-14 12:05:27.920835] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:00.959 [2024-05-14 12:05:28.032501] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:01.218 [2024-05-14 12:05:28.137431] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:01.477 [2024-05-14 12:05:28.307552] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:01.477 [2024-05-14 12:05:28.307610] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:01.477 [2024-05-14 12:05:28.307625] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:01.477 [2024-05-14 12:05:28.315570] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:01.477 [2024-05-14 12:05:28.315588] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:01.477 [2024-05-14 12:05:28.315599] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:01.477 [2024-05-14 12:05:28.323591] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:01.477 [2024-05-14 12:05:28.323609] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:01.477 [2024-05-14 12:05:28.323620] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram3' 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:02.045 12:05:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:29:02.045 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:29:02.045 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:29:02.045 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:29:02.045 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:29:02.045 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:29:02.045 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # 
(( i = 1 )) 00:29:02.045 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:29:02.045 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:29:02.045 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:02.304 1+0 records in 00:29:02.304 1+0 records out 00:29:02.304 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232388 s, 17.6 MB/s 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:02.304 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:29:02.563 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:29:02.563 12:05:29 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:29:02.563 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:29:02.563 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:29:02.563 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:02.564 1+0 records in 00:29:02.564 1+0 records out 00:29:02.564 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000350076 s, 11.7 MB/s 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:29:02.564 12:05:29 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:02.564 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:29:02.823 { 00:29:02.823 "nbd_device": "/dev/nbd0", 00:29:02.823 "bdev_name": "crypto_ram" 00:29:02.823 }, 00:29:02.823 { 00:29:02.823 "nbd_device": "/dev/nbd1", 00:29:02.823 "bdev_name": "crypto_ram3" 00:29:02.823 } 00:29:02.823 ]' 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:29:02.823 { 00:29:02.823 "nbd_device": "/dev/nbd0", 00:29:02.823 "bdev_name": "crypto_ram" 00:29:02.823 }, 00:29:02.823 { 00:29:02.823 "nbd_device": "/dev/nbd1", 00:29:02.823 "bdev_name": "crypto_ram3" 00:29:02.823 } 00:29:02.823 ]' 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:02.823 12:05:29 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:03.082 12:05:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:03.341 
12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:03.341 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@91 -- # local bdev_list 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:03.600 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:03.601 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:29:03.860 /dev/nbd0 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:29:03.860 12:05:30 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:03.860 1+0 records in 00:29:03.860 1+0 records out 00:29:03.860 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262599 s, 15.6 MB/s 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:03.860 12:05:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:29:04.119 /dev/nbd1 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:04.119 12:05:31 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:04.119 1+0 records in 00:29:04.119 1+0 records out 00:29:04.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000314553 s, 13.0 MB/s 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:04.119 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:29:04.379 { 00:29:04.379 "nbd_device": "/dev/nbd0", 00:29:04.379 "bdev_name": "crypto_ram" 00:29:04.379 }, 00:29:04.379 { 00:29:04.379 "nbd_device": "/dev/nbd1", 00:29:04.379 "bdev_name": "crypto_ram3" 00:29:04.379 } 00:29:04.379 ]' 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:29:04.379 { 00:29:04.379 "nbd_device": "/dev/nbd0", 00:29:04.379 "bdev_name": "crypto_ram" 00:29:04.379 }, 00:29:04.379 { 00:29:04.379 "nbd_device": "/dev/nbd1", 00:29:04.379 "bdev_name": "crypto_ram3" 00:29:04.379 } 00:29:04.379 ]' 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:29:04.379 /dev/nbd1' 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:29:04.379 /dev/nbd1' 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:29:04.379 256+0 records in 00:29:04.379 256+0 records out 00:29:04.379 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010653 s, 98.4 MB/s 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:29:04.379 256+0 records in 00:29:04.379 256+0 records out 00:29:04.379 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0314471 s, 33.3 MB/s 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:29:04.379 256+0 records in 00:29:04.379 256+0 records out 00:29:04.379 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0392576 s, 26.7 MB/s 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 
00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:29:04.379 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:04.639 12:05:31 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:04.639 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:04.898 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:04.898 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:04.898 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:04.898 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:04.898 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@41 -- # break 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:05.194 12:05:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:29:05.194 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:29:05.195 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:29:05.483 malloc_lvol_verify 00:29:05.483 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:29:05.741 05385b01-99b5-4a57-9928-f578fd5b3826 00:29:05.741 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:29:06.000 e0fcc3cc-c138-4ec6-901d-91d0dcf65bd6 00:29:06.000 12:05:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:29:06.259 /dev/nbd0 00:29:06.259 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:29:06.259 mke2fs 1.46.5 (30-Dec-2021) 00:29:06.259 Discarding device blocks: 0/4096 done 00:29:06.259 Creating filesystem with 4096 1k blocks and 1024 inodes 00:29:06.259 00:29:06.259 Allocating group tables: 0/1 done 00:29:06.259 Writing inode tables: 0/1 done 00:29:06.259 Creating journal (1024 blocks): done 00:29:06.259 Writing superblocks and filesystem accounting information: 0/1 done 00:29:06.259 00:29:06.259 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:29:06.259 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:29:06.259 
12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:06.259 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:06.259 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:06.259 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:06.259 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:06.259 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1833712 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 1833712 ']' 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 1833712 00:29:06.519 12:05:33 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1833712 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1833712' 00:29:06.519 killing process with pid 1833712 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@965 -- # kill 1833712 00:29:06.519 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@970 -- # wait 1833712 00:29:06.778 12:05:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:29:06.778 00:29:06.778 real 0m5.872s 00:29:06.778 user 0m8.386s 00:29:06.778 sys 0m2.301s 00:29:06.778 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:06.778 12:05:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:06.778 ************************************ 00:29:06.778 END TEST bdev_nbd 00:29:06.778 ************************************ 00:29:06.778 12:05:33 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:29:06.778 12:05:33 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:29:06.778 12:05:33 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:29:06.778 12:05:33 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:29:06.778 12:05:33 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:29:06.778 12:05:33 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 
00:29:06.779 12:05:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:06.779 ************************************ 00:29:06.779 START TEST bdev_fio 00:29:06.779 ************************************ 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:06.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:06.779 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:07.038 12:05:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:07.038 ************************************ 00:29:07.038 START TEST bdev_fio_rw_verify 00:29:07.038 ************************************ 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:07.039 12:05:33 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:29:07.039 12:05:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:07.039 12:05:34 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:07.039 12:05:34 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:07.039 12:05:34 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:07.039 12:05:34 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:07.039 12:05:34 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:29:07.039 12:05:34 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:07.039 12:05:34 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:07.039 12:05:34 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:07.039 12:05:34 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:07.039 12:05:34 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:07.298 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:07.298 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:07.298 fio-3.35 00:29:07.298 Starting 2 threads 00:29:19.512 00:29:19.512 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1834822: Tue May 14 12:05:44 2024 00:29:19.512 read: IOPS=31.5k, BW=123MiB/s (129MB/s)(1231MiB/10000msec) 00:29:19.512 slat (nsec): min=8926, max=56906, avg=14187.28, stdev=2990.08 00:29:19.512 clat (usec): min=4, max=323, avg=101.89, stdev=41.50 00:29:19.512 lat (usec): min=17, max=350, avg=116.07, stdev=42.64 00:29:19.512 clat percentiles (usec): 00:29:19.513 | 50.000th=[ 99], 99.000th=[ 200], 99.900th=[ 219], 99.990th=[ 243], 00:29:19.513 | 99.999th=[ 310] 00:29:19.513 write: IOPS=37.9k, 
BW=148MiB/s (155MB/s)(1404MiB/9477msec); 0 zone resets 00:29:19.513 slat (usec): min=8, max=1662, avg=23.28, stdev= 4.62 00:29:19.513 clat (usec): min=5, max=1868, avg=136.20, stdev=62.96 00:29:19.513 lat (usec): min=27, max=1891, avg=159.48, stdev=64.33 00:29:19.513 clat percentiles (usec): 00:29:19.513 | 50.000th=[ 133], 99.000th=[ 273], 99.900th=[ 306], 99.990th=[ 586], 00:29:19.513 | 99.999th=[ 840] 00:29:19.513 bw ( KiB/s): min=135488, max=149600, per=94.78%, avg=143800.42, stdev=2120.90, samples=38 00:29:19.513 iops : min=33872, max=37400, avg=35950.11, stdev=530.22, samples=38 00:29:19.513 lat (usec) : 10=0.01%, 20=0.01%, 50=9.17%, 100=31.85%, 250=56.68% 00:29:19.513 lat (usec) : 500=2.28%, 750=0.01%, 1000=0.01% 00:29:19.513 lat (msec) : 2=0.01% 00:29:19.513 cpu : usr=99.65%, sys=0.00%, ctx=27, majf=0, minf=396 00:29:19.513 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:19.513 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:19.513 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:19.513 issued rwts: total=315248,359474,0,0 short=0,0,0,0 dropped=0,0,0,0 00:29:19.513 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:19.513 00:29:19.513 Run status group 0 (all jobs): 00:29:19.513 READ: bw=123MiB/s (129MB/s), 123MiB/s-123MiB/s (129MB/s-129MB/s), io=1231MiB (1291MB), run=10000-10000msec 00:29:19.513 WRITE: bw=148MiB/s (155MB/s), 148MiB/s-148MiB/s (155MB/s-155MB/s), io=1404MiB (1472MB), run=9477-9477msec 00:29:19.513 00:29:19.513 real 0m11.193s 00:29:19.513 user 0m23.597s 00:29:19.513 sys 0m0.359s 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:29:19.513 ************************************ 00:29:19.513 END TEST bdev_fio_rw_verify 00:29:19.513 
************************************ 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # 
echo rw=trimwrite 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ff2f0007-58cd-5e70-be54-5140b34c1ba2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ff2f0007-58cd-5e70-be54-5140b34c1ba2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "8b21b50e-3315-5644-b6fa-3fc3f6fb2de9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "8b21b50e-3315-5644-b6fa-3fc3f6fb2de9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:29:19.513 crypto_ram3 ]] 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ff2f0007-58cd-5e70-be54-5140b34c1ba2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ff2f0007-58cd-5e70-be54-5140b34c1ba2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "8b21b50e-3315-5644-b6fa-3fc3f6fb2de9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' 
"num_blocks": 4096,' ' "uuid": "8b21b50e-3315-5644-b6fa-3fc3f6fb2de9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:19.513 ************************************ 00:29:19.513 START TEST bdev_fio_trim 00:29:19.513 ************************************ 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:29:19.513 12:05:45 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:19.513 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:19.514 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:19.514 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:19.514 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:29:19.514 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:19.514 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:19.514 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:19.514 12:05:45 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:19.514 12:05:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:19.514 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:19.514 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:19.514 fio-3.35 00:29:19.514 Starting 2 threads 00:29:29.494 00:29:29.494 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1836332: Tue May 14 12:05:56 2024 00:29:29.494 write: IOPS=30.9k, BW=121MiB/s (127MB/s)(1209MiB/10001msec); 0 zone resets 00:29:29.494 slat (usec): min=9, max=468, avg=27.97, stdev=12.51 00:29:29.494 clat (usec): min=24, max=1890, avg=211.19, stdev=152.80 00:29:29.494 lat (usec): min=33, max=1903, avg=239.16, stdev=162.26 00:29:29.494 clat percentiles (usec): 00:29:29.494 | 50.000th=[ 161], 99.000th=[ 603], 99.900th=[ 652], 99.990th=[ 717], 00:29:29.494 | 99.999th=[ 1123] 00:29:29.494 bw ( KiB/s): min=88128, max=191232, per=100.00%, avg=125086.74, stdev=23214.77, samples=38 00:29:29.494 iops : min=22032, max=47808, avg=31271.68, stdev=5803.69, samples=38 00:29:29.494 trim: IOPS=30.9k, BW=121MiB/s (127MB/s)(1209MiB/10001msec); 0 zone resets 00:29:29.494 slat (usec): min=4, max=1736, avg=13.73, stdev= 7.45 00:29:29.494 clat (usec): min=33, max=1903, avg=140.04, stdev=65.87 00:29:29.494 lat (usec): min=38, max=1910, avg=153.77, stdev=69.94 00:29:29.494 clat percentiles (usec): 
00:29:29.494 | 50.000th=[ 127], 99.000th=[ 322], 99.900th=[ 351], 99.990th=[ 429], 00:29:29.494 | 99.999th=[ 750] 00:29:29.494 bw ( KiB/s): min=88128, max=191232, per=100.00%, avg=125088.00, stdev=23215.18, samples=38 00:29:29.494 iops : min=22032, max=47808, avg=31272.00, stdev=5803.79, samples=38 00:29:29.494 lat (usec) : 50=5.17%, 100=27.79%, 250=45.32%, 500=18.63%, 750=3.09% 00:29:29.494 lat (usec) : 1000=0.01% 00:29:29.494 lat (msec) : 2=0.01% 00:29:29.494 cpu : usr=99.61%, sys=0.00%, ctx=30, majf=0, minf=350 00:29:29.494 IO depths : 1=7.9%, 2=18.2%, 4=59.1%, 8=14.8%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:29.494 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:29.494 complete : 0=0.0%, 4=87.1%, 8=12.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:29.494 issued rwts: total=0,309474,309474,0 short=0,0,0,0 dropped=0,0,0,0 00:29:29.494 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:29.494 00:29:29.494 Run status group 0 (all jobs): 00:29:29.494 WRITE: bw=121MiB/s (127MB/s), 121MiB/s-121MiB/s (127MB/s-127MB/s), io=1209MiB (1268MB), run=10001-10001msec 00:29:29.494 TRIM: bw=121MiB/s (127MB/s), 121MiB/s-121MiB/s (127MB/s-127MB/s), io=1209MiB (1268MB), run=10001-10001msec 00:29:29.494 00:29:29.494 real 0m11.202s 00:29:29.494 user 0m23.615s 00:29:29.494 sys 0m0.356s 00:29:29.494 12:05:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:29.494 12:05:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:29:29.494 ************************************ 00:29:29.494 END TEST bdev_fio_trim 00:29:29.494 ************************************ 00:29:29.754 12:05:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:29:29.754 12:05:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:29.754 12:05:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- 
# popd 00:29:29.754 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:29.754 12:05:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:29:29.754 00:29:29.754 real 0m22.773s 00:29:29.754 user 0m47.395s 00:29:29.754 sys 0m0.920s 00:29:29.754 12:05:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:29.754 12:05:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:29.754 ************************************ 00:29:29.754 END TEST bdev_fio 00:29:29.754 ************************************ 00:29:29.754 12:05:56 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:29.754 12:05:56 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:29.754 12:05:56 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:29:29.754 12:05:56 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:29.754 12:05:56 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:29.754 ************************************ 00:29:29.754 START TEST bdev_verify 00:29:29.754 ************************************ 00:29:29.754 12:05:56 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:29.754 [2024-05-14 12:05:56.735761] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:29:29.754 [2024-05-14 12:05:56.735800] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1837741 ] 00:29:30.014 [2024-05-14 12:05:56.844663] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:30.014 [2024-05-14 12:05:56.943944] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:30.014 [2024-05-14 12:05:56.943948] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:30.273 [2024-05-14 12:05:57.114769] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:30.273 [2024-05-14 12:05:57.114829] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:30.273 [2024-05-14 12:05:57.114843] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:30.273 [2024-05-14 12:05:57.122793] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:30.273 [2024-05-14 12:05:57.122811] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:30.273 [2024-05-14 12:05:57.122823] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:30.273 [2024-05-14 12:05:57.130814] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:30.273 [2024-05-14 12:05:57.130831] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:30.273 [2024-05-14 12:05:57.130842] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:30.273 Running I/O for 5 seconds... 
00:29:35.548 00:29:35.548 Latency(us) 00:29:35.548 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:35.548 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:35.548 Verification LBA range: start 0x0 length 0x800 00:29:35.548 crypto_ram : 5.01 5619.73 21.95 0.00 0.00 22676.64 1652.65 31457.28 00:29:35.548 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:35.548 Verification LBA range: start 0x800 length 0x800 00:29:35.548 crypto_ram : 5.03 5627.70 21.98 0.00 0.00 22648.03 1823.61 31457.28 00:29:35.548 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:35.548 Verification LBA range: start 0x0 length 0x800 00:29:35.548 crypto_ram3 : 5.03 2824.16 11.03 0.00 0.00 45039.02 2279.51 36016.31 00:29:35.548 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:35.548 Verification LBA range: start 0x800 length 0x800 00:29:35.548 crypto_ram3 : 5.03 2822.70 11.03 0.00 0.00 45057.16 1738.13 36016.31 00:29:35.548 =================================================================================================================== 00:29:35.548 Total : 16894.29 65.99 0.00 0.00 30158.00 1652.65 36016.31 00:29:35.548 00:29:35.548 real 0m5.764s 00:29:35.548 user 0m10.918s 00:29:35.548 sys 0m0.202s 00:29:35.548 12:06:02 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:35.548 12:06:02 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:29:35.548 ************************************ 00:29:35.548 END TEST bdev_verify 00:29:35.548 ************************************ 00:29:35.548 12:06:02 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:35.548 12:06:02 
blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:29:35.548 12:06:02 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:35.548 12:06:02 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:35.548 ************************************ 00:29:35.548 START TEST bdev_verify_big_io 00:29:35.548 ************************************ 00:29:35.548 12:06:02 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:35.548 [2024-05-14 12:06:02.620542] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:29:35.548 [2024-05-14 12:06:02.620609] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1838590 ] 00:29:35.807 [2024-05-14 12:06:02.752980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:35.807 [2024-05-14 12:06:02.855706] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:35.807 [2024-05-14 12:06:02.855710] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:36.066 [2024-05-14 12:06:03.024571] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:36.066 [2024-05-14 12:06:03.024644] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:36.066 [2024-05-14 12:06:03.024659] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:36.066 [2024-05-14 12:06:03.032593] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:36.066 [2024-05-14 12:06:03.032615] bdev.c:8109:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc1 00:29:36.066 [2024-05-14 12:06:03.032626] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:36.066 [2024-05-14 12:06:03.040613] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:36.066 [2024-05-14 12:06:03.040639] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:36.066 [2024-05-14 12:06:03.040651] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:36.066 Running I/O for 5 seconds... 00:29:41.408 00:29:41.408 Latency(us) 00:29:41.408 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:41.408 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:29:41.408 Verification LBA range: start 0x0 length 0x80 00:29:41.408 crypto_ram : 5.31 409.84 25.61 0.00 0.00 304357.25 7693.36 408488.74 00:29:41.408 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:29:41.408 Verification LBA range: start 0x80 length 0x80 00:29:41.408 crypto_ram : 5.31 409.72 25.61 0.00 0.00 304452.47 7351.43 408488.74 00:29:41.408 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:29:41.408 Verification LBA range: start 0x0 length 0x80 00:29:41.408 crypto_ram3 : 5.32 216.34 13.52 0.00 0.00 553168.42 6553.60 428548.45 00:29:41.408 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:29:41.408 Verification LBA range: start 0x80 length 0x80 00:29:41.408 crypto_ram3 : 5.33 216.28 13.52 0.00 0.00 553335.10 6382.64 428548.45 00:29:41.408 =================================================================================================================== 00:29:41.408 Total : 1252.17 78.26 0.00 0.00 390544.17 6382.64 428548.45 00:29:41.667 00:29:41.667 real 0m6.127s 00:29:41.667 user 0m11.528s 00:29:41.667 sys 0m0.248s 00:29:41.667 
12:06:08 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:41.667 12:06:08 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:29:41.667 ************************************ 00:29:41.667 END TEST bdev_verify_big_io 00:29:41.667 ************************************ 00:29:41.667 12:06:08 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:41.667 12:06:08 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:29:41.667 12:06:08 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:41.667 12:06:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:41.927 ************************************ 00:29:41.927 START TEST bdev_write_zeroes 00:29:41.927 ************************************ 00:29:41.927 12:06:08 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:41.927 [2024-05-14 12:06:08.810574] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:29:41.927 [2024-05-14 12:06:08.810617] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1839852 ] 00:29:41.927 [2024-05-14 12:06:08.923060] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:42.186 [2024-05-14 12:06:09.020217] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:42.186 [2024-05-14 12:06:09.190413] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:42.186 [2024-05-14 12:06:09.190479] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:42.186 [2024-05-14 12:06:09.190494] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.186 [2024-05-14 12:06:09.198435] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:42.186 [2024-05-14 12:06:09.198453] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:42.186 [2024-05-14 12:06:09.198472] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.186 [2024-05-14 12:06:09.206453] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:42.186 [2024-05-14 12:06:09.206471] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:42.186 [2024-05-14 12:06:09.206482] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.186 Running I/O for 1 seconds... 
00:29:43.565 00:29:43.565 Latency(us) 00:29:43.565 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:43.565 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:29:43.565 crypto_ram : 1.01 26550.69 103.71 0.00 0.00 4808.99 2080.06 7522.39 00:29:43.565 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:29:43.565 crypto_ram3 : 1.01 13304.81 51.97 0.00 0.00 9549.67 3333.79 11910.46 00:29:43.565 =================================================================================================================== 00:29:43.565 Total : 39855.50 155.69 0.00 0.00 6394.25 2080.06 11910.46 00:29:43.565 00:29:43.565 real 0m1.728s 00:29:43.565 user 0m1.502s 00:29:43.565 sys 0m0.206s 00:29:43.565 12:06:10 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:43.565 12:06:10 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:29:43.565 ************************************ 00:29:43.565 END TEST bdev_write_zeroes 00:29:43.565 ************************************ 00:29:43.565 12:06:10 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:43.565 12:06:10 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:29:43.565 12:06:10 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:43.565 12:06:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:43.565 ************************************ 00:29:43.565 START TEST bdev_json_nonenclosed 00:29:43.565 ************************************ 00:29:43.565 12:06:10 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:43.833 [2024-05-14 12:06:10.657165] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:29:43.833 [2024-05-14 12:06:10.657229] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1840051 ] 00:29:43.833 [2024-05-14 12:06:10.786134] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:43.833 [2024-05-14 12:06:10.883591] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:43.833 [2024-05-14 12:06:10.883661] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:29:43.833 [2024-05-14 12:06:10.883681] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:43.833 [2024-05-14 12:06:10.883693] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:44.094 00:29:44.094 real 0m0.387s 00:29:44.094 user 0m0.227s 00:29:44.094 sys 0m0.157s 00:29:44.094 12:06:10 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:44.094 12:06:10 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:29:44.094 ************************************ 00:29:44.094 END TEST bdev_json_nonenclosed 00:29:44.094 ************************************ 00:29:44.094 12:06:11 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:44.094 12:06:11 blockdev_crypto_sw -- common/autotest_common.sh@1097 
-- # '[' 13 -le 1 ']' 00:29:44.094 12:06:11 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:44.094 12:06:11 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:44.094 ************************************ 00:29:44.094 START TEST bdev_json_nonarray 00:29:44.094 ************************************ 00:29:44.094 12:06:11 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:44.094 [2024-05-14 12:06:11.128210] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:29:44.094 [2024-05-14 12:06:11.128268] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1840077 ] 00:29:44.354 [2024-05-14 12:06:11.254481] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:44.354 [2024-05-14 12:06:11.351709] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:44.354 [2024-05-14 12:06:11.351782] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:29:44.354 [2024-05-14 12:06:11.351802] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:44.354 [2024-05-14 12:06:11.351814] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:44.613 00:29:44.613 real 0m0.379s 00:29:44.613 user 0m0.231s 00:29:44.613 sys 0m0.146s 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:29:44.613 ************************************ 00:29:44.613 END TEST bdev_json_nonarray 00:29:44.613 ************************************ 00:29:44.613 12:06:11 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:29:44.613 12:06:11 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:29:44.613 12:06:11 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:29:44.613 12:06:11 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:29:44.613 12:06:11 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:29:44.613 12:06:11 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:44.613 12:06:11 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:44.613 ************************************ 00:29:44.613 START TEST bdev_crypto_enomem 00:29:44.613 ************************************ 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1121 -- # bdev_crypto_enomem 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:29:44.613 12:06:11 
blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=1840262 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 1840262 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@827 -- # '[' -z 1840262 ']' 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@832 -- # local max_retries=100 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:44.613 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # xtrace_disable 00:29:44.613 12:06:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:44.613 [2024-05-14 12:06:11.593673] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:29:44.613 [2024-05-14 12:06:11.593741] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1840262 ] 00:29:44.872 [2024-05-14 12:06:11.714241] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:44.872 [2024-05-14 12:06:11.816173] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@860 -- # return 0 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:45.441 true 00:29:45.441 base0 00:29:45.441 true 00:29:45.441 [2024-05-14 12:06:12.481952] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:45.441 crypt0 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@895 -- # local bdev_name=crypt0 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local i 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # 
bdev_timeout=2000 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:45.441 [ 00:29:45.441 { 00:29:45.441 "name": "crypt0", 00:29:45.441 "aliases": [ 00:29:45.441 "aeac0237-02ef-5f03-a768-5329f65e630e" 00:29:45.441 ], 00:29:45.441 "product_name": "crypto", 00:29:45.441 "block_size": 512, 00:29:45.441 "num_blocks": 2097152, 00:29:45.441 "uuid": "aeac0237-02ef-5f03-a768-5329f65e630e", 00:29:45.441 "assigned_rate_limits": { 00:29:45.441 "rw_ios_per_sec": 0, 00:29:45.441 "rw_mbytes_per_sec": 0, 00:29:45.441 "r_mbytes_per_sec": 0, 00:29:45.441 "w_mbytes_per_sec": 0 00:29:45.441 }, 00:29:45.441 "claimed": false, 00:29:45.441 "zoned": false, 00:29:45.441 "supported_io_types": { 00:29:45.441 "read": true, 00:29:45.441 "write": true, 00:29:45.441 "unmap": false, 00:29:45.441 "write_zeroes": true, 00:29:45.441 "flush": false, 00:29:45.441 "reset": true, 00:29:45.441 "compare": false, 00:29:45.441 "compare_and_write": false, 00:29:45.441 "abort": false, 00:29:45.441 "nvme_admin": false, 00:29:45.441 "nvme_io": false 00:29:45.441 }, 00:29:45.441 "memory_domains": [ 00:29:45.441 { 00:29:45.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:45.441 "dma_device_type": 2 00:29:45.441 } 00:29:45.441 ], 00:29:45.441 
"driver_specific": { 00:29:45.441 "crypto": { 00:29:45.441 "base_bdev_name": "EE_base0", 00:29:45.441 "name": "crypt0", 00:29:45.441 "key_name": "test_dek_sw" 00:29:45.441 } 00:29:45.441 } 00:29:45.441 } 00:29:45.441 ] 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@903 -- # return 0 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=1840289 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:29:45.441 12:06:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:45.700 Running I/O for 5 seconds... 00:29:46.633 12:06:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:29:46.633 12:06:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:46.633 12:06:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:46.633 12:06:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:46.633 12:06:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 1840289 00:29:50.823 00:29:50.823 Latency(us) 00:29:50.823 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:50.823 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:29:50.823 crypt0 : 5.00 35814.51 139.90 0.00 0.00 889.36 414.94 1182.50 00:29:50.823 =================================================================================================================== 00:29:50.823 Total : 35814.51 139.90 0.00 0.00 889.36 414.94 1182.50 00:29:50.823 0 00:29:50.823 12:06:17 
blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:29:50.823 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:50.823 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:50.823 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:50.823 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 1840262 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@946 -- # '[' -z 1840262 ']' 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@950 -- # kill -0 1840262 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@951 -- # uname 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1840262 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1840262' 00:29:50.824 killing process with pid 1840262 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@965 -- # kill 1840262 00:29:50.824 Received shutdown signal, test time was about 5.000000 seconds 00:29:50.824 00:29:50.824 Latency(us) 00:29:50.824 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:50.824 =================================================================================================================== 
00:29:50.824 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:50.824 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@970 -- # wait 1840262 00:29:51.083 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:29:51.083 00:29:51.083 real 0m6.376s 00:29:51.083 user 0m6.564s 00:29:51.083 sys 0m0.376s 00:29:51.083 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:51.084 12:06:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:29:51.084 ************************************ 00:29:51.084 END TEST bdev_crypto_enomem 00:29:51.084 ************************************ 00:29:51.084 12:06:17 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:29:51.084 12:06:17 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:29:51.084 12:06:17 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:29:51.084 12:06:17 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:51.084 12:06:17 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:29:51.084 12:06:17 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:29:51.084 12:06:17 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:29:51.084 12:06:17 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:29:51.084 00:29:51.084 real 0m54.479s 00:29:51.084 user 1m33.370s 00:29:51.084 sys 0m6.362s 00:29:51.084 12:06:17 blockdev_crypto_sw -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:51.084 12:06:17 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:51.084 ************************************ 00:29:51.084 END TEST blockdev_crypto_sw 00:29:51.084 ************************************ 00:29:51.084 
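As an editorial aside (not part of the captured log): the MiB/s column in the bdevperf summary above is derived from the IOPS column and the 4096-byte IO size of the `randwrite` job. A minimal sketch reproducing the reported 139.90 MiB/s for the `crypt0` job from its 35814.51 average IOPS:

```python
# Recompute bdevperf's MiB/s column from its IOPS column.
# Both input values are taken verbatim from the crypt0 job summary in the log.
iops = 35814.51          # average IOPS over the 5-second run
io_size = 4096           # bytes per IO ("IO size: 4096" in the job header)

mib_per_sec = iops * io_size / (1024 ** 2)  # bytes/s -> MiB/s
print(round(mib_per_sec, 2))  # → 139.9
```

This confirms the two columns are internally consistent; the same arithmetic applies to any bdevperf result line.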
12:06:17 -- spdk/autotest.sh@355 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:29:51.084 12:06:17 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:29:51.084 12:06:17 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:51.084 12:06:17 -- common/autotest_common.sh@10 -- # set +x 00:29:51.084 ************************************ 00:29:51.084 START TEST blockdev_crypto_qat 00:29:51.084 ************************************ 00:29:51.084 12:06:18 blockdev_crypto_qat -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:29:51.084 * Looking for test storage... 00:29:51.084 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:29:51.084 12:06:18 blockdev_crypto_qat -- 
bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1841083 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1841083 00:29:51.084 12:06:18 blockdev_crypto_qat -- common/autotest_common.sh@827 -- # '[' -z 1841083 ']' 00:29:51.084 12:06:18 blockdev_crypto_qat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:51.084 12:06:18 blockdev_crypto_qat -- common/autotest_common.sh@832 -- # local max_retries=100 00:29:51.084 12:06:18 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:51.084 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:51.084 12:06:18 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # xtrace_disable 00:29:51.084 12:06:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:51.084 12:06:18 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:29:51.343 [2024-05-14 12:06:18.199282] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:29:51.343 [2024-05-14 12:06:18.199358] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1841083 ] 00:29:51.343 [2024-05-14 12:06:18.327473] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:51.602 [2024-05-14 12:06:18.430698] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:52.170 12:06:19 blockdev_crypto_qat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:29:52.170 12:06:19 blockdev_crypto_qat -- common/autotest_common.sh@860 -- # return 0 00:29:52.170 12:06:19 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:29:52.170 12:06:19 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:29:52.170 12:06:19 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:29:52.170 12:06:19 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:52.170 12:06:19 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:52.170 [2024-05-14 12:06:19.124890] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:52.170 [2024-05-14 12:06:19.132925] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation 
encrypt will be assigned to module dpdk_cryptodev 00:29:52.170 [2024-05-14 12:06:19.140942] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:52.170 [2024-05-14 12:06:19.215666] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:54.707 true 00:29:54.707 true 00:29:54.707 true 00:29:54.707 true 00:29:54.707 Malloc0 00:29:54.707 Malloc1 00:29:54.707 Malloc2 00:29:54.707 Malloc3 00:29:54.707 [2024-05-14 12:06:21.600980] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:54.707 crypto_ram 00:29:54.707 [2024-05-14 12:06:21.608998] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:54.707 crypto_ram1 00:29:54.707 [2024-05-14 12:06:21.617018] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:54.707 crypto_ram2 00:29:54.707 [2024-05-14 12:06:21.625040] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:54.707 crypto_ram3 00:29:54.707 [ 00:29:54.707 { 00:29:54.707 "name": "Malloc1", 00:29:54.707 "aliases": [ 00:29:54.707 "2f0fc4a4-a6fe-4839-8a7c-a01877e9b594" 00:29:54.707 ], 00:29:54.707 "product_name": "Malloc disk", 00:29:54.707 "block_size": 512, 00:29:54.707 "num_blocks": 65536, 00:29:54.707 "uuid": "2f0fc4a4-a6fe-4839-8a7c-a01877e9b594", 00:29:54.707 "assigned_rate_limits": { 00:29:54.707 "rw_ios_per_sec": 0, 00:29:54.707 "rw_mbytes_per_sec": 0, 00:29:54.707 "r_mbytes_per_sec": 0, 00:29:54.707 "w_mbytes_per_sec": 0 00:29:54.707 }, 00:29:54.707 "claimed": true, 00:29:54.707 "claim_type": "exclusive_write", 00:29:54.707 "zoned": false, 00:29:54.707 "supported_io_types": { 00:29:54.707 "read": true, 00:29:54.707 "write": true, 00:29:54.707 "unmap": true, 00:29:54.707 "write_zeroes": true, 00:29:54.707 "flush": true, 00:29:54.707 "reset": true, 00:29:54.707 "compare": false, 00:29:54.707 
"compare_and_write": false, 00:29:54.707 "abort": true, 00:29:54.707 "nvme_admin": false, 00:29:54.707 "nvme_io": false 00:29:54.707 }, 00:29:54.707 "memory_domains": [ 00:29:54.707 { 00:29:54.707 "dma_device_id": "system", 00:29:54.707 "dma_device_type": 1 00:29:54.707 }, 00:29:54.707 { 00:29:54.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:54.707 "dma_device_type": 2 00:29:54.707 } 00:29:54.707 ], 00:29:54.707 "driver_specific": {} 00:29:54.707 } 00:29:54.707 ] 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.707 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.707 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:29:54.707 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.707 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.707 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 
00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.707 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:29:54.707 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.707 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:29:54.707 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:54.967 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.967 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:29:54.967 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:29:54.967 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "a4d0ca15-30f0-5893-ae33-dd68db0808d2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "a4d0ca15-30f0-5893-ae33-dd68db0808d2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": 
"crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "e6df3945-1551-5680-bc12-7f1cbf658241"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e6df3945-1551-5680-bc12-7f1cbf658241",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "61873d5e-499e-521a-bcac-b8c1ad8e4844"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "61873d5e-499e-521a-bcac-b8c1ad8e4844",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "58211217-e71a-5e2a-8798-40bfdb33d85c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "58211217-e71a-5e2a-8798-40bfdb33d85c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:29:54.967 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:29:54.967 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:29:54.967 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:29:54.968 12:06:21 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 1841083 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@946 -- # '[' -z 1841083 ']' 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@950 -- # kill -0 1841083 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@951 -- # uname 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@951 -- # '[' Linux = 
Linux ']' 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1841083 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1841083' 00:29:54.968 killing process with pid 1841083 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@965 -- # kill 1841083 00:29:54.968 12:06:21 blockdev_crypto_qat -- common/autotest_common.sh@970 -- # wait 1841083 00:29:55.535 12:06:22 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:55.535 12:06:22 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:55.535 12:06:22 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:29:55.535 12:06:22 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:55.535 12:06:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:55.535 ************************************ 00:29:55.535 START TEST bdev_hello_world 00:29:55.535 ************************************ 00:29:55.535 12:06:22 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:55.535 [2024-05-14 12:06:22.511890] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:29:55.535 [2024-05-14 12:06:22.511948] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1841711 ] 00:29:55.794 [2024-05-14 12:06:22.638474] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.794 [2024-05-14 12:06:22.734926] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.794 [2024-05-14 12:06:22.756367] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:55.794 [2024-05-14 12:06:22.764387] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:55.794 [2024-05-14 12:06:22.772411] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:55.794 [2024-05-14 12:06:22.871818] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:58.332 [2024-05-14 12:06:25.076347] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:58.332 [2024-05-14 12:06:25.076419] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:58.332 [2024-05-14 12:06:25.076434] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:58.332 [2024-05-14 12:06:25.084365] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:58.332 [2024-05-14 12:06:25.084385] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:58.332 [2024-05-14 12:06:25.084405] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:58.332 [2024-05-14 12:06:25.092385] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:58.332 
[2024-05-14 12:06:25.092410] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:58.332 [2024-05-14 12:06:25.092421] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:58.332 [2024-05-14 12:06:25.100413] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:58.332 [2024-05-14 12:06:25.100430] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:58.332 [2024-05-14 12:06:25.100441] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:58.332 [2024-05-14 12:06:25.178160] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:29:58.332 [2024-05-14 12:06:25.178202] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:29:58.332 [2024-05-14 12:06:25.178222] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:29:58.332 [2024-05-14 12:06:25.179510] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:29:58.332 [2024-05-14 12:06:25.179585] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:29:58.332 [2024-05-14 12:06:25.179602] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:29:58.332 [2024-05-14 12:06:25.179646] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:29:58.332 00:29:58.332 [2024-05-14 12:06:25.179665] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:29:58.592 00:29:58.592 real 0m3.139s 00:29:58.592 user 0m2.733s 00:29:58.592 sys 0m0.371s 00:29:58.592 12:06:25 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:58.592 12:06:25 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:29:58.592 ************************************ 00:29:58.592 END TEST bdev_hello_world 00:29:58.592 ************************************ 00:29:58.592 12:06:25 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:29:58.592 12:06:25 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:29:58.592 12:06:25 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:58.592 12:06:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:58.592 ************************************ 00:29:58.592 START TEST bdev_bounds 00:29:58.592 ************************************ 00:29:58.592 12:06:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1842109 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1842109' 00:29:58.852 Process bdevio pid: 1842109 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1842109 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 1842109 ']' 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- 
common/autotest_common.sh@832 -- # local max_retries=100 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:58.852 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:58.852 12:06:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:58.852 [2024-05-14 12:06:25.733467] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:29:58.852 [2024-05-14 12:06:25.733529] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1842109 ] 00:29:58.852 [2024-05-14 12:06:25.862819] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:59.111 [2024-05-14 12:06:25.973211] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:59.111 [2024-05-14 12:06:25.973299] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:59.111 [2024-05-14 12:06:25.973294] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:29:59.111 [2024-05-14 12:06:25.994661] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:59.111 [2024-05-14 12:06:26.002693] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:59.111 [2024-05-14 12:06:26.010713] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation 
decrypt will be assigned to module dpdk_cryptodev 00:29:59.111 [2024-05-14 12:06:26.109777] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:01.688 [2024-05-14 12:06:28.326322] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:01.688 [2024-05-14 12:06:28.326395] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:01.688 [2024-05-14 12:06:28.326417] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:01.688 [2024-05-14 12:06:28.334342] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:01.688 [2024-05-14 12:06:28.334361] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:01.688 [2024-05-14 12:06:28.334373] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:01.688 [2024-05-14 12:06:28.342362] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:01.688 [2024-05-14 12:06:28.342379] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:01.688 [2024-05-14 12:06:28.342390] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:01.688 [2024-05-14 12:06:28.350392] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:01.688 [2024-05-14 12:06:28.350416] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:01.688 [2024-05-14 12:06:28.350428] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:01.688 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:01.688 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:30:01.688 
12:06:28 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:01.688 I/O targets: 00:30:01.688 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:30:01.688 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:30:01.688 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:30:01.688 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:30:01.688 00:30:01.688 00:30:01.688 CUnit - A unit testing framework for C - Version 2.1-3 00:30:01.688 http://cunit.sourceforge.net/ 00:30:01.688 00:30:01.688 00:30:01.688 Suite: bdevio tests on: crypto_ram3 00:30:01.688 Test: blockdev write read block ...passed 00:30:01.688 Test: blockdev write zeroes read block ...passed 00:30:01.688 Test: blockdev write zeroes read no split ...passed 00:30:01.688 Test: blockdev write zeroes read split ...passed 00:30:01.688 Test: blockdev write zeroes read split partial ...passed 00:30:01.688 Test: blockdev reset ...passed 00:30:01.688 Test: blockdev write read 8 blocks ...passed 00:30:01.688 Test: blockdev write read size > 128k ...passed 00:30:01.688 Test: blockdev write read invalid size ...passed 00:30:01.688 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:01.688 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:01.688 Test: blockdev write read max offset ...passed 00:30:01.688 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:01.688 Test: blockdev writev readv 8 blocks ...passed 00:30:01.688 Test: blockdev writev readv 30 x 1block ...passed 00:30:01.688 Test: blockdev writev readv block ...passed 00:30:01.688 Test: blockdev writev readv size > 128k ...passed 00:30:01.688 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:01.688 Test: blockdev comparev and writev ...passed 00:30:01.688 Test: blockdev nvme passthru rw ...passed 00:30:01.688 Test: blockdev nvme passthru vendor specific ...passed 
00:30:01.688 Test: blockdev nvme admin passthru ...passed 00:30:01.688 Test: blockdev copy ...passed 00:30:01.688 Suite: bdevio tests on: crypto_ram2 00:30:01.688 Test: blockdev write read block ...passed 00:30:01.688 Test: blockdev write zeroes read block ...passed 00:30:01.688 Test: blockdev write zeroes read no split ...passed 00:30:01.688 Test: blockdev write zeroes read split ...passed 00:30:01.688 Test: blockdev write zeroes read split partial ...passed 00:30:01.688 Test: blockdev reset ...passed 00:30:01.688 Test: blockdev write read 8 blocks ...passed 00:30:01.688 Test: blockdev write read size > 128k ...passed 00:30:01.688 Test: blockdev write read invalid size ...passed 00:30:01.688 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:01.688 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:01.688 Test: blockdev write read max offset ...passed 00:30:01.688 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:01.688 Test: blockdev writev readv 8 blocks ...passed 00:30:01.688 Test: blockdev writev readv 30 x 1block ...passed 00:30:01.688 Test: blockdev writev readv block ...passed 00:30:01.688 Test: blockdev writev readv size > 128k ...passed 00:30:01.688 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:01.688 Test: blockdev comparev and writev ...passed 00:30:01.688 Test: blockdev nvme passthru rw ...passed 00:30:01.688 Test: blockdev nvme passthru vendor specific ...passed 00:30:01.688 Test: blockdev nvme admin passthru ...passed 00:30:01.688 Test: blockdev copy ...passed 00:30:01.688 Suite: bdevio tests on: crypto_ram1 00:30:01.688 Test: blockdev write read block ...passed 00:30:01.688 Test: blockdev write zeroes read block ...passed 00:30:01.688 Test: blockdev write zeroes read no split ...passed 00:30:01.688 Test: blockdev write zeroes read split ...passed 00:30:01.688 Test: blockdev write zeroes read split partial ...passed 00:30:01.688 Test: blockdev 
reset ...passed 00:30:01.689 Test: blockdev write read 8 blocks ...passed 00:30:01.689 Test: blockdev write read size > 128k ...passed 00:30:01.689 Test: blockdev write read invalid size ...passed 00:30:01.689 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:01.689 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:01.689 Test: blockdev write read max offset ...passed 00:30:01.689 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:01.689 Test: blockdev writev readv 8 blocks ...passed 00:30:01.689 Test: blockdev writev readv 30 x 1block ...passed 00:30:01.689 Test: blockdev writev readv block ...passed 00:30:01.689 Test: blockdev writev readv size > 128k ...passed 00:30:01.689 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:01.689 Test: blockdev comparev and writev ...passed 00:30:01.689 Test: blockdev nvme passthru rw ...passed 00:30:01.689 Test: blockdev nvme passthru vendor specific ...passed 00:30:01.689 Test: blockdev nvme admin passthru ...passed 00:30:01.689 Test: blockdev copy ...passed 00:30:01.689 Suite: bdevio tests on: crypto_ram 00:30:01.689 Test: blockdev write read block ...passed 00:30:01.689 Test: blockdev write zeroes read block ...passed 00:30:01.689 Test: blockdev write zeroes read no split ...passed 00:30:01.689 Test: blockdev write zeroes read split ...passed 00:30:01.948 Test: blockdev write zeroes read split partial ...passed 00:30:01.949 Test: blockdev reset ...passed 00:30:01.949 Test: blockdev write read 8 blocks ...passed 00:30:01.949 Test: blockdev write read size > 128k ...passed 00:30:01.949 Test: blockdev write read invalid size ...passed 00:30:01.949 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:01.949 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:01.949 Test: blockdev write read max offset ...passed 00:30:01.949 Test: blockdev write read 2 blocks on overlapped address 
offset ...passed 00:30:01.949 Test: blockdev writev readv 8 blocks ...passed 00:30:01.949 Test: blockdev writev readv 30 x 1block ...passed 00:30:01.949 Test: blockdev writev readv block ...passed 00:30:01.949 Test: blockdev writev readv size > 128k ...passed 00:30:01.949 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:01.949 Test: blockdev comparev and writev ...passed 00:30:01.949 Test: blockdev nvme passthru rw ...passed 00:30:01.949 Test: blockdev nvme passthru vendor specific ...passed 00:30:01.949 Test: blockdev nvme admin passthru ...passed 00:30:01.949 Test: blockdev copy ...passed 00:30:01.949 00:30:01.949 Run Summary: Type Total Ran Passed Failed Inactive 00:30:01.949 suites 4 4 n/a 0 0 00:30:01.949 tests 92 92 92 0 0 00:30:01.949 asserts 520 520 520 0 n/a 00:30:01.949 00:30:01.949 Elapsed time = 0.531 seconds 00:30:01.949 0 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1842109 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 1842109 ']' 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 1842109 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1842109 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1842109' 00:30:01.949 killing process with pid 1842109 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- 
common/autotest_common.sh@965 -- # kill 1842109 00:30:01.949 12:06:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@970 -- # wait 1842109 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:30:02.522 00:30:02.522 real 0m3.637s 00:30:02.522 user 0m10.118s 00:30:02.522 sys 0m0.586s 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:02.522 ************************************ 00:30:02.522 END TEST bdev_bounds 00:30:02.522 ************************************ 00:30:02.522 12:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:30:02.522 12:06:29 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:30:02.522 12:06:29 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:02.522 12:06:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:02.522 ************************************ 00:30:02.522 START TEST bdev_nbd 00:30:02.522 ************************************ 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local 
conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1842635 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:02.522 12:06:29 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1842635 /var/tmp/spdk-nbd.sock 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 1842635 ']' 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:30:02.522 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:02.522 12:06:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:02.522 [2024-05-14 12:06:29.474292] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:30:02.522 [2024-05-14 12:06:29.474355] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:02.522 [2024-05-14 12:06:29.602952] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:02.781 [2024-05-14 12:06:29.709553] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:02.781 [2024-05-14 12:06:29.730887] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:02.781 [2024-05-14 12:06:29.738910] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:02.781 [2024-05-14 12:06:29.746928] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:02.781 [2024-05-14 12:06:29.855614] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:05.317 [2024-05-14 12:06:32.066847] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:05.317 [2024-05-14 12:06:32.066913] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:05.317 [2024-05-14 12:06:32.066929] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:05.317 [2024-05-14 12:06:32.074867] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:05.317 [2024-05-14 12:06:32.074887] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:05.317 [2024-05-14 12:06:32.074899] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:05.317 [2024-05-14 12:06:32.082886] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:05.317 [2024-05-14 
12:06:32.082905] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:05.317 [2024-05-14 12:06:32.082917] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:05.317 [2024-05-14 12:06:32.090907] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:05.317 [2024-05-14 12:06:32.090925] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:05.317 [2024-05-14 12:06:32.090936] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@24 -- # local i 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:05.317 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:05.577 1+0 records in 00:30:05.577 1+0 records out 00:30:05.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285981 s, 14.3 MB/s 00:30:05.577 
12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:05.577 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:05.837 1+0 records in 00:30:05.837 1+0 records out 00:30:05.837 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000356135 s, 11.5 MB/s 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:05.837 12:06:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # 
local i 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:06.097 1+0 records in 00:30:06.097 1+0 records out 00:30:06.097 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278121 s, 14.7 MB/s 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:06.097 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:30:06.356 
12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:06.356 1+0 records in 00:30:06.356 1+0 records out 00:30:06.356 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0003274 s, 12.5 MB/s 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:06.356 
12:06:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:06.356 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:30:06.616 { 00:30:06.616 "nbd_device": "/dev/nbd0", 00:30:06.616 "bdev_name": "crypto_ram" 00:30:06.616 }, 00:30:06.616 { 00:30:06.616 "nbd_device": "/dev/nbd1", 00:30:06.616 "bdev_name": "crypto_ram1" 00:30:06.616 }, 00:30:06.616 { 00:30:06.616 "nbd_device": "/dev/nbd2", 00:30:06.616 "bdev_name": "crypto_ram2" 00:30:06.616 }, 00:30:06.616 { 00:30:06.616 "nbd_device": "/dev/nbd3", 00:30:06.616 "bdev_name": "crypto_ram3" 00:30:06.616 } 00:30:06.616 ]' 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:30:06.616 { 00:30:06.616 "nbd_device": "/dev/nbd0", 00:30:06.616 "bdev_name": "crypto_ram" 00:30:06.616 }, 00:30:06.616 { 00:30:06.616 "nbd_device": "/dev/nbd1", 00:30:06.616 "bdev_name": "crypto_ram1" 00:30:06.616 }, 00:30:06.616 { 00:30:06.616 "nbd_device": "/dev/nbd2", 00:30:06.616 "bdev_name": "crypto_ram2" 00:30:06.616 }, 00:30:06.616 { 00:30:06.616 "nbd_device": "/dev/nbd3", 00:30:06.616 "bdev_name": "crypto_ram3" 00:30:06.616 } 00:30:06.616 ]' 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:30:06.616 
12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:06.616 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:06.875 12:06:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # 
waitfornbd_exit nbd1 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:07.135 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:30:07.393 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:30:07.393 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:30:07.393 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:30:07.394 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:07.394 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:07.394 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:30:07.394 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:07.394 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:07.394 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:07.394 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:30:07.653 
12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:07.653 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:07.913 12:06:34 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:07.913 
12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:07.913 12:06:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:30:08.172 /dev/nbd0 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:08.172 1+0 records in 00:30:08.172 1+0 records out 00:30:08.172 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292837 s, 14.0 MB/s 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:08.172 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:30:08.432 /dev/nbd1 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:08.432 12:06:35 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:08.432 1+0 records in 00:30:08.432 1+0 records out 00:30:08.432 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000320917 s, 12.8 MB/s 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:08.432 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:30:08.691 /dev/nbd10 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:08.691 12:06:35 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:08.691 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:08.950 1+0 records in 00:30:08.950 1+0 records out 00:30:08.950 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000321438 s, 12.7 MB/s 00:30:08.950 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:08.950 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:08.950 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:08.950 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:08.950 12:06:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:08.950 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:08.950 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:08.950 12:06:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:30:08.950 /dev/nbd11 00:30:09.209 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:30:09.209 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:30:09.209 
12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:30:09.209 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:30:09.209 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:30:09.209 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:30:09.209 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:30:09.209 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:30:09.209 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:09.210 1+0 records in 00:30:09.210 1+0 records out 00:30:09.210 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306185 s, 13.4 MB/s 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:09.210 12:06:36 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:09.210 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:30:09.469 { 00:30:09.469 "nbd_device": "/dev/nbd0", 00:30:09.469 "bdev_name": "crypto_ram" 00:30:09.469 }, 00:30:09.469 { 00:30:09.469 "nbd_device": "/dev/nbd1", 00:30:09.469 "bdev_name": "crypto_ram1" 00:30:09.469 }, 00:30:09.469 { 00:30:09.469 "nbd_device": "/dev/nbd10", 00:30:09.469 "bdev_name": "crypto_ram2" 00:30:09.469 }, 00:30:09.469 { 00:30:09.469 "nbd_device": "/dev/nbd11", 00:30:09.469 "bdev_name": "crypto_ram3" 00:30:09.469 } 00:30:09.469 ]' 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:30:09.469 { 00:30:09.469 "nbd_device": "/dev/nbd0", 00:30:09.469 "bdev_name": "crypto_ram" 00:30:09.469 }, 00:30:09.469 { 00:30:09.469 "nbd_device": "/dev/nbd1", 00:30:09.469 "bdev_name": "crypto_ram1" 00:30:09.469 }, 00:30:09.469 { 00:30:09.469 "nbd_device": "/dev/nbd10", 00:30:09.469 "bdev_name": "crypto_ram2" 00:30:09.469 }, 00:30:09.469 { 00:30:09.469 "nbd_device": "/dev/nbd11", 00:30:09.469 "bdev_name": "crypto_ram3" 00:30:09.469 } 00:30:09.469 ]' 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:30:09.469 /dev/nbd1 00:30:09.469 /dev/nbd10 00:30:09.469 /dev/nbd11' 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:30:09.469 /dev/nbd1 00:30:09.469 /dev/nbd10 00:30:09.469 /dev/nbd11' 00:30:09.469 12:06:36 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:30:09.469 256+0 records in 00:30:09.469 256+0 records out 00:30:09.469 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00661701 s, 158 MB/s 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:30:09.469 256+0 records in 00:30:09.469 256+0 records out 00:30:09.469 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.082886 s, 12.7 MB/s 00:30:09.469 12:06:36 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:30:09.469 256+0 records in 00:30:09.469 256+0 records out 00:30:09.469 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0635951 s, 16.5 MB/s 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:09.469 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:30:09.729 256+0 records in 00:30:09.729 256+0 records out 00:30:09.729 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0577128 s, 18.2 MB/s 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:30:09.729 256+0 records in 00:30:09.729 256+0 records out 00:30:09.729 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.055454 s, 18.9 MB/s 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:09.729 12:06:36 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:09.729 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:09.988 12:06:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:10.248 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:10.248 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:10.248 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:10.248 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:10.248 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:10.248 12:06:37 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:10.248 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:10.248 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:10.248 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:10.248 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:10.507 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:30:10.767 12:06:37 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:10.767 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:11.025 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:11.025 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:11.025 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:11.025 12:06:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:11.025 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:11.025 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:11.025 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:11.025 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:11.025 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:11.025 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:30:11.025 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:30:11.025 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # 
return 0 00:30:11.025 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:11.026 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:11.026 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:11.026 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:30:11.026 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:30:11.026 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:30:11.284 malloc_lvol_verify 00:30:11.284 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:30:11.543 5a05208c-cb62-4a57-a3e7-5393baac8f74 00:30:11.543 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:30:11.802 bd08787b-8f83-4446-a72a-113c54cd8336 00:30:11.802 12:06:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:30:12.060 /dev/nbd0 00:30:12.060 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:30:12.060 mke2fs 1.46.5 (30-Dec-2021) 00:30:12.060 Discarding device blocks: 0/4096 done 00:30:12.060 Creating filesystem with 4096 1k blocks and 1024 inodes 00:30:12.060 00:30:12.060 Allocating group tables: 0/1 done 00:30:12.060 Writing inode tables: 0/1 
done 00:30:12.060 Creating journal (1024 blocks): done 00:30:12.060 Writing superblocks and filesystem accounting information: 0/1 done 00:30:12.060 00:30:12.060 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:30:12.060 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:30:12.060 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:12.060 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:30:12.060 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:12.060 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:12.060 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:12.060 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:30:12.320 12:06:39 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1842635 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 1842635 ']' 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 1842635 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1842635 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1842635' 00:30:12.320 killing process with pid 1842635 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@965 -- # kill 1842635 00:30:12.320 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@970 -- # wait 1842635 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:30:12.887 00:30:12.887 real 0m10.375s 00:30:12.887 user 0m13.490s 00:30:12.887 sys 0m4.150s 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:12.887 ************************************ 00:30:12.887 END TEST bdev_nbd 00:30:12.887 ************************************ 00:30:12.887 12:06:39 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:30:12.887 12:06:39 blockdev_crypto_qat -- bdev/blockdev.sh@764 
-- # '[' crypto_qat = nvme ']' 00:30:12.887 12:06:39 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:30:12.887 12:06:39 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:30:12.887 12:06:39 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:30:12.887 12:06:39 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:12.887 12:06:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:12.887 ************************************ 00:30:12.887 START TEST bdev_fio 00:30:12.887 ************************************ 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:12.887 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:12.887 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 
00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:30:12.888 12:06:39 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:12.888 12:06:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 
00:30:13.146 ************************************ 00:30:13.146 START TEST bdev_fio_rw_verify 00:30:13.146 ************************************ 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:30:13.146 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:13.147 12:06:40 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:13.405 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:13.405 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:13.405 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:13.405 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:13.405 fio-3.35 00:30:13.405 Starting 4 threads 00:30:28.386 00:30:28.386 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1844600: Tue May 14 12:06:53 2024 00:30:28.386 read: IOPS=20.0k, BW=78.1MiB/s (81.9MB/s)(782MiB/10001msec) 00:30:28.386 slat (usec): min=17, max=1520, avg=68.71, stdev=36.12 00:30:28.386 clat (usec): min=16, max=2198, avg=380.99, stdev=238.35 00:30:28.386 lat (usec): min=69, max=2253, avg=449.69, stdev=256.82 00:30:28.386 clat percentiles (usec): 00:30:28.386 | 50.000th=[ 318], 99.000th=[ 1123], 99.900th=[ 1254], 99.990th=[ 1450], 00:30:28.386 | 99.999th=[ 1532] 00:30:28.386 write: IOPS=22.0k, BW=86.0MiB/s (90.2MB/s)(837MiB/9729msec); 0 zone resets 00:30:28.386 slat (usec): min=25, max=492, avg=81.86, stdev=36.79 00:30:28.386 clat (usec): min=36, max=1530, avg=424.71, stdev=255.12 00:30:28.386 lat (usec): min=81, max=1643, avg=506.56, stdev=274.66 00:30:28.386 clat percentiles (usec): 00:30:28.386 | 50.000th=[ 367], 99.000th=[ 1221], 99.900th=[ 1352], 99.990th=[ 1434], 00:30:28.386 | 99.999th=[ 1500] 00:30:28.386 bw ( KiB/s): min=70528, max=125512, per=97.44%, avg=85796.47, stdev=3067.15, samples=76 00:30:28.386 iops : min=17632, max=31378, avg=21449.05, 
stdev=766.78, samples=76 00:30:28.386 lat (usec) : 20=0.01%, 50=0.01%, 100=1.86%, 250=29.94%, 500=40.23% 00:30:28.386 lat (usec) : 750=17.82%, 1000=6.73% 00:30:28.386 lat (msec) : 2=3.41%, 4=0.01% 00:30:28.386 cpu : usr=99.60%, sys=0.00%, ctx=69, majf=0, minf=307 00:30:28.386 IO depths : 1=4.8%, 2=27.2%, 4=54.4%, 8=13.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:28.386 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:28.386 complete : 0=0.0%, 4=88.0%, 8=12.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:28.386 issued rwts: total=200077,214149,0,0 short=0,0,0,0 dropped=0,0,0,0 00:30:28.386 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:28.386 00:30:28.386 Run status group 0 (all jobs): 00:30:28.386 READ: bw=78.1MiB/s (81.9MB/s), 78.1MiB/s-78.1MiB/s (81.9MB/s-81.9MB/s), io=782MiB (820MB), run=10001-10001msec 00:30:28.386 WRITE: bw=86.0MiB/s (90.2MB/s), 86.0MiB/s-86.0MiB/s (90.2MB/s-90.2MB/s), io=837MiB (877MB), run=9729-9729msec 00:30:28.386 00:30:28.386 real 0m13.526s 00:30:28.386 user 0m45.808s 00:30:28.386 sys 0m0.490s 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:30:28.386 ************************************ 00:30:28.386 END TEST bdev_fio_rw_verify 00:30:28.386 ************************************ 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1276 -- # local 
config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:30:28.386 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "a4d0ca15-30f0-5893-ae33-dd68db0808d2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "a4d0ca15-30f0-5893-ae33-dd68db0808d2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "e6df3945-1551-5680-bc12-7f1cbf658241"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e6df3945-1551-5680-bc12-7f1cbf658241",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "61873d5e-499e-521a-bcac-b8c1ad8e4844"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 
8192,' ' "uuid": "61873d5e-499e-521a-bcac-b8c1ad8e4844",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "58211217-e71a-5e2a-8798-40bfdb33d85c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "58211217-e71a-5e2a-8798-40bfdb33d85c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:30:28.387 12:06:53 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:30:28.387 crypto_ram1 00:30:28.387 crypto_ram2 00:30:28.387 crypto_ram3 ]] 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "a4d0ca15-30f0-5893-ae33-dd68db0808d2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "a4d0ca15-30f0-5893-ae33-dd68db0808d2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "e6df3945-1551-5680-bc12-7f1cbf658241"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e6df3945-1551-5680-bc12-7f1cbf658241",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' 
' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "61873d5e-499e-521a-bcac-b8c1ad8e4844"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "61873d5e-499e-521a-bcac-b8c1ad8e4844",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "58211217-e71a-5e2a-8798-40bfdb33d85c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "58211217-e71a-5e2a-8798-40bfdb33d85c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:28.387 12:06:53 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:28.387 ************************************ 00:30:28.387 START TEST bdev_fio_trim 00:30:28.387 ************************************ 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 
00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:30:28.387 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:28.388 12:06:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:28.388 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:28.388 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:28.388 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:28.388 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:28.388 fio-3.35 00:30:28.388 Starting 4 threads 00:30:40.600 00:30:40.600 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1846465: Tue May 14 12:07:06 2024 00:30:40.600 write: IOPS=34.4k, BW=134MiB/s (141MB/s)(1344MiB/10001msec); 0 zone resets 00:30:40.600 slat (usec): min=18, max=1138, avg=67.32, stdev=26.04 00:30:40.600 clat (usec): min=38, max=1256, avg=241.42, stdev=126.62 00:30:40.600 lat (usec): min=66, 
max=1621, avg=308.73, stdev=138.45 00:30:40.600 clat percentiles (usec): 00:30:40.600 | 50.000th=[ 219], 99.000th=[ 627], 99.900th=[ 742], 99.990th=[ 816], 00:30:40.600 | 99.999th=[ 1139] 00:30:40.600 bw ( KiB/s): min=107584, max=225984, per=100.00%, avg=138499.37, stdev=9244.44, samples=76 00:30:40.600 iops : min=26896, max=56496, avg=34624.84, stdev=2311.11, samples=76 00:30:40.600 trim: IOPS=34.4k, BW=134MiB/s (141MB/s)(1344MiB/10001msec); 0 zone resets 00:30:40.600 slat (usec): min=4, max=341, avg=19.74, stdev= 9.94 00:30:40.600 clat (usec): min=66, max=1621, avg=308.91, stdev=138.46 00:30:40.600 lat (usec): min=71, max=1657, avg=328.66, stdev=143.01 00:30:40.600 clat percentiles (usec): 00:30:40.600 | 50.000th=[ 289], 99.000th=[ 725], 99.900th=[ 848], 99.990th=[ 947], 00:30:40.600 | 99.999th=[ 1336] 00:30:40.600 bw ( KiB/s): min=107584, max=225984, per=100.00%, avg=138499.37, stdev=9244.44, samples=76 00:30:40.600 iops : min=26896, max=56496, avg=34624.84, stdev=2311.11, samples=76 00:30:40.600 lat (usec) : 50=0.10%, 100=6.88%, 250=41.37%, 500=45.58%, 750=5.66% 00:30:40.600 lat (usec) : 1000=0.39% 00:30:40.600 lat (msec) : 2=0.01% 00:30:40.600 cpu : usr=99.61%, sys=0.00%, ctx=62, majf=0, minf=100 00:30:40.600 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:40.600 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:40.600 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:40.600 issued rwts: total=0,344062,344062,0 short=0,0,0,0 dropped=0,0,0,0 00:30:40.600 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:40.600 00:30:40.600 Run status group 0 (all jobs): 00:30:40.600 WRITE: bw=134MiB/s (141MB/s), 134MiB/s-134MiB/s (141MB/s-141MB/s), io=1344MiB (1409MB), run=10001-10001msec 00:30:40.600 TRIM: bw=134MiB/s (141MB/s), 134MiB/s-134MiB/s (141MB/s-141MB/s), io=1344MiB (1409MB), run=10001-10001msec 00:30:40.600 00:30:40.600 real 0m13.490s 00:30:40.600 user 0m45.905s 
00:30:40.600 sys 0m0.454s
00:30:40.600 12:07:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable
00:30:40.600 12:07:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:30:40.600 ************************************
00:30:40.600 END TEST bdev_fio_trim
************************************
00:30:40.600 12:07:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:30:40.600 12:07:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:30:40.600 12:07:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:30:40.600 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:30:40.600 12:07:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:30:40.600
00:30:40.600 real 0m27.377s
00:30:40.600 user 1m31.899s
00:30:40.600 sys 0m1.130s
00:30:40.600 12:07:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable
00:30:40.600 12:07:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:30:40.600 ************************************
00:30:40.600 END TEST bdev_fio
************************************
00:30:40.600 12:07:07 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:30:40.600 12:07:07 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:30:40.600 12:07:07 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']'
00:30:40.600 12:07:07 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable
00:30:40.600 12:07:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:30:40.600 ************************************
00:30:40.600 START TEST bdev_verify
************************************
00:30:40.600 12:07:07 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:30:40.600 [2024-05-14 12:07:07.383196] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:30:40.600 [2024-05-14 12:07:07.383255] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1847808 ]
00:30:40.601 [2024-05-14 12:07:07.512179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:30:40.601 [2024-05-14 12:07:07.612712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:30:40.601 [2024-05-14 12:07:07.612718] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:30:40.601 [2024-05-14 12:07:07.634075] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:30:40.601 [2024-05-14 12:07:07.642104] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:30:40.601 [2024-05-14 12:07:07.650125] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:30:40.860 [2024-05-14 12:07:07.751866] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:30:43.400 [2024-05-14 12:07:09.962608] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:30:43.400 [2024-05-14 12:07:09.962686] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:30:43.400 [2024-05-14 12:07:09.962702] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:30:43.400 [2024-05-14 12:07:09.970624] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:30:43.400 [2024-05-14 12:07:09.970645] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:30:43.400 [2024-05-14 12:07:09.970657] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:30:43.400 [2024-05-14 12:07:09.978645] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:30:43.400 [2024-05-14 12:07:09.978663] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:30:43.400 [2024-05-14 12:07:09.978674] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:30:43.400 [2024-05-14 12:07:09.986666] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:30:43.400 [2024-05-14 12:07:09.986683] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:30:43.400 [2024-05-14 12:07:09.986694] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:30:43.400 Running I/O for 5 seconds...
00:30:48.678
00:30:48.678 Latency(us)
00:30:48.678 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:48.678 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:30:48.678 Verification LBA range: start 0x0 length 0x1000
00:30:48.678 crypto_ram : 5.08 491.47 1.92 0.00 0.00 258899.04 3447.76 177802.02
00:30:48.678 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:30:48.678 Verification LBA range: start 0x1000 length 0x1000
00:30:48.678 crypto_ram : 5.08 497.42 1.94 0.00 0.00 255945.01 4217.10 177802.02
00:30:48.678 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:30:48.678 Verification LBA range: start 0x0 length 0x1000
00:30:48.678 crypto_ram1 : 5.08 494.43 1.93 0.00 0.00 256663.38 3362.28 160477.72
00:30:48.678 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:30:48.678 Verification LBA range: start 0x1000 length 0x1000
00:30:48.678 crypto_ram1 : 5.08 501.92 1.96 0.00 0.00 253219.09 4103.12 160477.72
00:30:48.678 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:30:48.678 Verification LBA range: start 0x0 length 0x1000
00:30:48.678 crypto_ram2 : 5.06 3844.19 15.02 0.00 0.00 32942.03 4473.54 25872.47
00:30:48.678 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:30:48.678 Verification LBA range: start 0x1000 length 0x1000
00:30:48.678 crypto_ram2 : 5.06 3869.34 15.11 0.00 0.00 32714.94 6781.55 25872.47
00:30:48.678 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:30:48.678 Verification LBA range: start 0x0 length 0x1000
00:30:48.678 crypto_ram3 : 5.06 3841.62 15.01 0.00 0.00 32872.37 5698.78 25758.50
00:30:48.678 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:30:48.678 Verification LBA range: start 0x1000 length 0x1000
00:30:48.678 crypto_ram3 : 5.07 3875.71 15.14 0.00 0.00 32581.90 1837.86 25530.55
00:30:48.678 ===================================================================================================================
00:30:48.678 Total : 17416.10 68.03 0.00 0.00 58318.28 1837.86 177802.02
00:30:48.678
00:30:48.678 real 0m8.233s
00:30:48.678 user 0m15.601s
00:30:48.678 sys 0m0.389s
00:30:48.678 12:07:15 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable
00:30:48.678 12:07:15 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:30:48.678 ************************************
00:30:48.678 END TEST bdev_verify
************************************
00:30:48.678 12:07:15 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:30:48.678 12:07:15 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']'
00:30:48.678 12:07:15 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable
00:30:48.678 12:07:15 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:30:48.678 ************************************
00:30:48.678 START TEST bdev_verify_big_io
************************************
00:30:48.678 12:07:15 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:30:48.678 [2024-05-14 12:07:15.697089] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:30:48.678 [2024-05-14 12:07:15.697149] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1848870 ]
00:30:48.938 [2024-05-14 12:07:15.824960] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:30:48.938 [2024-05-14 12:07:15.928159] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:30:48.938 [2024-05-14 12:07:15.928165] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:30:48.938 [2024-05-14 12:07:15.949551] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:30:48.938 [2024-05-14 12:07:15.957582] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:30:48.938 [2024-05-14 12:07:15.965601] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:30:49.197 [2024-05-14 12:07:16.072271] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:30:51.733 [2024-05-14 12:07:18.275099] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:30:51.733 [2024-05-14 12:07:18.275178] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:30:51.733 [2024-05-14 12:07:18.275192] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:30:51.733 [2024-05-14 12:07:18.283117] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:30:51.733 [2024-05-14 12:07:18.283136] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:30:51.733 [2024-05-14 12:07:18.283147] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:30:51.733 [2024-05-14 12:07:18.291136] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:30:51.733 [2024-05-14 12:07:18.291153] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:30:51.733 [2024-05-14 12:07:18.291164] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:30:51.733 [2024-05-14 12:07:18.299159] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:30:51.733 [2024-05-14 12:07:18.299189] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:30:51.733 [2024-05-14 12:07:18.299201] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:30:51.733 Running I/O for 5 seconds...
00:30:52.304 [2024-05-14 12:07:19.210393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.304 [2024-05-14 12:07:19.210819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.304 [2024-05-14 12:07:19.210897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.304 [2024-05-14 12:07:19.210946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.304 [2024-05-14 12:07:19.210988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.304 [2024-05-14 12:07:19.211032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.304 [2024-05-14 12:07:19.211434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.304 [2024-05-14 12:07:19.211452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.306 [2024-05-14 12:07:19.295125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.295166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.295646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.295691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.295735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.295776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.296206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.296224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.299303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.299349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.299391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.299437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.299884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.306 [2024-05-14 12:07:19.299929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.299971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.300013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.300412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.300429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.303484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.303531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.303574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.303617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.304088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.304134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.304188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.304231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.306 [2024-05-14 12:07:19.304722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.304739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.307691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.307744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.307786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.307828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.308217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.308263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.308305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.308347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.308701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.308718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.311905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.306 [2024-05-14 12:07:19.311965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.312031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.312075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.312536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.312593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.312635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.312676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.313056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.313072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.315992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.316037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.316092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.316134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.306 [2024-05-14 12:07:19.316658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.316708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.316749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.316791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.317239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.317257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.320108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.320154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.320196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.320252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.320758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.320804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.320846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.306 [2024-05-14 12:07:19.320891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.321323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.321341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.306 [2024-05-14 12:07:19.324067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.324112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.324153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.324194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.324659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.324703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.324746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.324790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.325196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.325213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.307 [2024-05-14 12:07:19.328036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.328083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.328125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.328168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.328607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.328653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.328702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.328744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.329170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.329188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.332006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.332053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.332099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.307 [2024-05-14 12:07:19.332141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.332683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.332728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.332770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.332812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.333230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.333247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.335167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.335214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.335259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.335301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.335616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.335661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.307 [2024-05-14 12:07:19.335708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.335749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.336016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.336033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.338424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.338470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.338515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.338556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.339025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.339071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.339118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.339162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.339595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.307 [2024-05-14 12:07:19.339612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.341450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.341497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.341542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.341575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.341881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.341924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.341972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.342013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.342285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.342302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.344936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.345338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.307 [2024-05-14 12:07:19.346695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.348004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.349833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.350790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.352490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.354056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.354333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.354350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.357102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.357982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.359301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.360848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.362575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.307 [2024-05-14 12:07:19.363812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.365114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.366669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.366946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.366964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.370002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.371559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.373253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.374871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.375867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.377182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.378738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.380299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.307 [2024-05-14 12:07:19.380585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.380604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.384693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.386010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.307 [2024-05-14 12:07:19.387560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.570 [2024-05-14 12:07:19.389118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.570 [2024-05-14 12:07:19.391351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.570 [2024-05-14 12:07:19.392940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.570 [2024-05-14 12:07:19.394601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.570 [2024-05-14 12:07:19.396303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.570 [2024-05-14 12:07:19.396676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.570 [2024-05-14 12:07:19.396693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.570 [2024-05-14 12:07:19.400674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.570 [2024-05-14 12:07:19.402234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.403788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.404884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.406451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.408012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.409575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.410253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.410739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.410765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.414682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.416325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.418062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.419060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.420912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.422453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.423818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.424208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.424660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.424679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.428519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.430085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.430817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.432271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.434096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.435654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.436050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.436444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.436865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.436884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.440681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.442002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.443393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.444691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.446538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.447461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.447868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.448261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.448712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.448731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.452237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.453026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.454330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.455895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.457730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.458123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.458514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.458902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.459347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.459367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.462099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.463809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.465376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.467008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.467741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.468134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.468525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.468913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.469347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.469364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.472245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.473567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.475116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.476682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.477441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.477834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.478221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.478612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.478895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.478912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.481979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.483529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.485100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.486294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.487094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.487486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.487873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.488938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.489286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.489303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.492442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.494014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.570 [2024-05-14 12:07:19.495678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.496073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.496912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.497302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.497850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.499172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.499452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.499470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.502729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.504285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.505171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.505575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.506410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.506803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.508245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.509566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.509844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.509865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.513159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.514570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.514965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.515354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.516164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.517155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.518467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.520026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.520304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.520321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.523575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.523982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.524373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.524766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.525609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.527101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.528761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.530329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.530611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.530628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.533436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.533826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.534217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.534626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.536369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.537683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.539219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.540782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.541187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.541204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.543204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.543601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.543992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.544381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.546044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.547599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.549162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.550379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.550677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.550694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.552818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.553210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.553605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.554142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.555856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.557413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.559041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.559917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.560259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.560276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.562483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.562876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.563270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.564808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.566698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.568252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.569056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.570593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.570869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.570886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.573352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.573745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.574725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.576022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.577857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.579222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.580537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.581833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.582109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.582126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.584817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.585212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.586770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.588499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.590329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.591044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.592347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.593913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.571 [2024-05-14 12:07:19.594192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.594209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.596839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.598206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.599512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.601067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.602332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.604056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.605592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.607215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.607505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.607523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.610900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.612222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.613779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.615342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.616943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.618252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.619814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.621375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.621767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.621784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.626029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.627630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.629173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.630634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.632249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.633784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.635326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.636485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.636912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.636929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.640463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.642024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.643582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.644316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.646035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.647595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.649236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.649640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.650106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.650124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.572 [2024-05-14 12:07:19.653805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.835 [2024-05-14 12:07:19.655380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.835 [2024-05-14 12:07:19.656465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.835 [2024-05-14 12:07:19.658082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.835 [2024-05-14 12:07:19.659943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.661503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.662225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.662628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.663064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.663082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.666878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.668426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.669216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.670517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.672441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.674005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.674393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.674828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.675260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.675277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.678178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.679517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.680817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.682296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.683041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.683436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.683825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.684211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.684656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.684674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.687418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.687832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.688234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.688629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.689410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.689807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.690200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.690593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.691031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.691049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.693802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.694196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.694605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.694647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.695486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.695884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.696276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.696669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.697130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.697151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.699875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.700268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.700663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.701059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:52.836 [2024-05-14 12:07:19.701111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.701516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.701912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.702298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.702689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.703085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.703442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.703460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.705799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.705848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.705891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.705943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.706325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.836 [2024-05-14 12:07:19.706393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.706451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.706525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.706581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.707022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.707040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.709472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.709530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.709572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.709615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.709955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.710016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.710060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.836 [2024-05-14 12:07:19.710102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.710144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.710602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.710621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.712974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.713032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.713074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.713129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.713517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.713583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.713626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.713669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.713711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.836 [2024-05-14 12:07:19.714152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.714172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.716430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.716476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.716519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.716561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.717044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.836 [2024-05-14 12:07:19.717096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.717140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.717182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.717226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.717671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.717689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.837 [2024-05-14 12:07:19.720140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.720186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.720228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.720269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.720734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.720793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.720836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.720879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.720922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.721336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.721352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.723586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.723636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.837 [2024-05-14 12:07:19.723679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.723721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.724178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.724231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.724279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.724321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.724362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.724828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.724847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.727224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.727269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.727313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.727355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.837 [2024-05-14 12:07:19.727769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.727823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.727865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.727908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.727950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.728388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.728411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.730847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.730906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.730952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.730994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.731458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.731516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.837 [2024-05-14 12:07:19.731558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.731601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.731642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.732042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.732059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.734441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.734486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.734528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.734571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.735017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.735073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.735117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.735167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.837 [2024-05-14 12:07:19.735208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.735713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.735730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.738170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.738217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.738259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.738303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.738685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.738750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.738793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.738835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.738878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.739240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.837 [2024-05-14 12:07:19.739258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.741687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.741733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.741787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.741830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.742345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.742413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.742458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.742527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.742581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.742951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.742968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.745391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.837 [2024-05-14 12:07:19.745445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.745496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.745538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.745917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.745981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.746025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.746066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.746122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.746579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.746597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.748975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.749068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.837 [2024-05-14 12:07:19.749110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.838 [2024-05-14 12:07:19.749152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.749484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.749552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.749597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.749638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.749694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.750189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.750207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.752548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.752594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.752650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.752728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.753124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.838 [2024-05-14 12:07:19.753183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.753226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.753267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.753309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.753744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.753766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.755955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.756001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.756044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.756088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.756552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.756608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.838 [2024-05-14 12:07:19.756653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.838 [2024-05-14 12:07:19.756695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:52.841 [previous line repeated continuously, timestamps 2024-05-14 12:07:19.756736 through 12:07:19.913792] 
00:30:52.841 [2024-05-14 12:07:19.915368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.841 [2024-05-14 12:07:19.915822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.841 [2024-05-14 12:07:19.916211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.841 [2024-05-14 12:07:19.916608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.841 [2024-05-14 12:07:19.917080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:52.841 [2024-05-14 12:07:19.917099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.920205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.921425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.922737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.924288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.924564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.925663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.926053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.111 [2024-05-14 12:07:19.926445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.926833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.927265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.927283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.929498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.930837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.932394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.933956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.934229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.934646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.935037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.935432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.935826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.111 [2024-05-14 12:07:19.936110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.936128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.939205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.940534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.942075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.943618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.943999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.944416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.944806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.945194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.946067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.946408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.946426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.111 [2024-05-14 12:07:19.949306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.950876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.952427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.953419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.953866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.954266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.954661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.955050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.956688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.956960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.956978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.960222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.961946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.111 [2024-05-14 12:07:19.963550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.963941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.964379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.964792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.965182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.966427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.967717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.967990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.968007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.971149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.972697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.973315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.973710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.111 [2024-05-14 12:07:19.974136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.974543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.975162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.976458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.977997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.978270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.978287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.981473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.982789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.983181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.983576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.984022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.984426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.111 [2024-05-14 12:07:19.985896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.987219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.988784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.989060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.989078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.992279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.111 [2024-05-14 12:07:19.992691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:19.993083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:19.993478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:19.993902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:19.994800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:19.996100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:19.997649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.112 [2024-05-14 12:07:19.999210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:19.999535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:19.999553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.002049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.002455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.002845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.003234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.003684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.005375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.006960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.008628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.010269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.010640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.112 [2024-05-14 12:07:20.010658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.012545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.012939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.013329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.013723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.014004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.015291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.016783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.018106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.018801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.019198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.019228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.021799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.112 [2024-05-14 12:07:20.022373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.023735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.025723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.026094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.027380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.029316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.031522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.032052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.032613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.032644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.036184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.038127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.040146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.112 [2024-05-14 12:07:20.041270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.041850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.042357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.042876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.044324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.045864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.046196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.046225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.049701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.050143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.050579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.051007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.051502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.112 [2024-05-14 12:07:20.053047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.054626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.056194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.057270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.057612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.057640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.059968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.060413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.060841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.061581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.061893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.063613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.065207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.112 [2024-05-14 12:07:20.066735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.067924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.068235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.068252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.070376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.070778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.071169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.072803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.073079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.074639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.076176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.076906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.112 [2024-05-14 12:07:20.078206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.112 [2024-05-14 12:07:20.078489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 (last message repeated through [2024-05-14 12:07:20.200917])
00:30:53.407 [2024-05-14 12:07:20.200959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.201000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.201271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.201289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.203555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.203603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.203645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.203687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.204127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.204189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.204232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.204280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.204321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.407 [2024-05-14 12:07:20.204650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.204668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.206295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.206341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.206382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.206429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.206760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.206818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.206860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.206901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.206942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.207208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.207224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.407 [2024-05-14 12:07:20.209418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.209466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.209507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.209549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.209999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.210051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.210093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.210135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.210179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.210498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.210516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.212116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.212162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.407 [2024-05-14 12:07:20.212209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.212255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.212540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.212602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.212645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.212692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.212733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.213003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.213020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.214945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.214991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.215035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.215077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.407 [2024-05-14 12:07:20.215530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.215589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.215632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.215675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.215719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.216154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.216171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.217716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.217762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.217811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.217853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.218202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.218258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.407 [2024-05-14 12:07:20.218300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.218341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.218381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.218689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.218706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.220486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.220533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.220583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.220626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.221054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.221127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.221173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.407 [2024-05-14 12:07:20.221215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.407 [2024-05-14 12:07:20.221256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.221695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.221714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.223418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.223465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.223509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.223550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.223880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.223939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.223984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.224026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.224072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.224343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.408 [2024-05-14 12:07:20.224360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.225999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.226044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.226086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.226127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.226562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.226618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.226663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.226705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.226747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.227175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.227192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.229119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.408 [2024-05-14 12:07:20.229164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.229205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.229246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.229516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.229575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.229616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.229657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.229704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.230057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.230074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.231600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.231647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.231688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.408 [2024-05-14 12:07:20.231730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.232148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.232210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.232253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.232295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.232337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.232778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.232797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.234866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.234911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.234965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.235012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.235281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.408 [2024-05-14 12:07:20.235340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.235387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.235433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.235478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.235748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.235765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.237348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.237394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.237443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.237485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.237900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.237975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.238019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.408 [2024-05-14 12:07:20.238060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.238101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.238548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.238567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.240721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.240767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.240808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.240849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.241114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.241172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.241214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.241255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.241297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.408 [2024-05-14 12:07:20.241572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.241589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.243243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.243288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.244980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.245027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.245380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.245448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.245496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.245537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.245579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.246010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.408 [2024-05-14 12:07:20.246029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.408 [2024-05-14 12:07:20.248001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:53.694 [... previous message repeated through 2024-05-14 12:07:20.511520 ...]
00:30:53.694 [2024-05-14 12:07:20.511954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.511971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.514813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.515209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.515600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.515992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.516479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.516885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.517281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.517675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.518068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.518485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.518503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.694 [2024-05-14 12:07:20.521201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.521603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.522003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.522402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.522857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.523261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.523658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.524050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.524455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.524861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.524878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.527716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.528112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.694 [2024-05-14 12:07:20.528507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.694 [2024-05-14 12:07:20.528905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.529352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.529759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.530151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.530548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.530938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.531369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.531386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.533985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.534383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.534780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.535172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.695 [2024-05-14 12:07:20.535574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.535979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.536372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.536767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.537165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.537571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.537589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.540338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.540746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.541143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.541536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.541964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.542363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.695 [2024-05-14 12:07:20.542761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.543159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.543566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.544035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.544053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.546753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.547152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.547542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.547930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.548349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.548758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.549167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.549563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.695 [2024-05-14 12:07:20.549952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.550409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.550428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.553095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.553493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.553542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.553933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.554277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.554689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.555090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.555485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.555874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.556217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.695 [2024-05-14 12:07:20.556234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.558914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.559305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.559704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.559753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.560219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.560632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.561025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.561419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.561824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.562263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.562280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.565057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.695 [2024-05-14 12:07:20.565123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.565177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.565229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.565688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.565754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.565809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.565850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.565892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.566264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.566281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.568677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.568725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.568768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.695 [2024-05-14 12:07:20.568809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.569199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.569264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.569308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.569354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.569396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.569876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.569894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.572291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.572351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.572393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.572442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.572824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.695 [2024-05-14 12:07:20.572888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.572930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.572977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.573022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.573470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.695 [2024-05-14 12:07:20.573489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.575800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.575845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.575902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.575945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.576430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.576486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.576530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.696 [2024-05-14 12:07:20.576572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.576615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.576995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.577012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.579341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.579387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.579432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.579476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.579922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.579980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.580023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.580066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.580127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.696 [2024-05-14 12:07:20.580586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.580605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.582913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.582959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.583001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.583045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.583464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.583523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.583566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.583608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.583649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.584066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.584084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.696 [2024-05-14 12:07:20.586394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.586445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.586490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.586534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.586926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.586992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.587036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.587078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.587121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.587574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.587595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.589388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.696 [2024-05-14 12:07:20.589448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.696 [2024-05-14 12:07:20.589495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:53.699 [2024-05-14 12:07:20.666833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.668385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.669388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.669795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.670225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.670635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.671028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.672612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.674058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.674331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.674347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.677746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.679311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.699 [2024-05-14 12:07:20.679715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.680107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.680473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.680877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.681889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.683205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.684767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.685040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.685056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.688280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.688883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.689277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.689672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.699 [2024-05-14 12:07:20.690114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.690553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.691991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.693572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.695131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.695408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.695425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.698341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.698745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.699137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.699531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.699966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.701267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.699 [2024-05-14 12:07:20.702580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.704123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.705672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.706060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.706077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.699 [2024-05-14 12:07:20.708047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.708448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.708839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.709230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.709589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.710909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.712466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.714029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.700 [2024-05-14 12:07:20.715071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.715349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.715366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.717451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.717849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.718240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.718922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.719196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.720845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.722551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.724152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.725249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.725596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.700 [2024-05-14 12:07:20.725614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.727823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.728221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.728616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.730273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.730552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.732120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.733681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.734403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.735719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.735992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.736009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.738329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.700 [2024-05-14 12:07:20.738731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.739947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.741264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.741542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.743108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.744147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.745817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.747352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.747631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.747648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.750133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.750536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.751934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.700 [2024-05-14 12:07:20.753240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.753518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.755089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.755944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.757538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.759223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.759504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.759521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.762013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.762917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.764227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.765775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.766048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.700 [2024-05-14 12:07:20.767386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.768789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.770101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.771659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.771930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.771947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.774685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.700 [2024-05-14 12:07:20.776165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.962 [2024-05-14 12:07:20.777792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.779354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.779638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.780484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.781790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.963 [2024-05-14 12:07:20.783344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.784907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.785214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.785231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.789116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.790442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.791985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.793540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.793938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.795526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.797261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.798901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.800446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.963 [2024-05-14 12:07:20.800813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.800831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.804428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.805994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.807568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.808558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.808837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.810145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.811705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.813268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.813807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.814279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.814297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.963 [2024-05-14 12:07:20.818037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.819760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.821339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.822461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.822794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.824368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.825933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.827081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.827479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.827928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.827949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.831560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.833116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.963 [2024-05-14 12:07:20.833849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.835174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.835451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.837076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.838779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.839176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.839569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.839943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.839960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.843497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.844644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.846219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.963 [2024-05-14 12:07:20.847654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.963 [2024-05-14 12:07:20.847929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:53.966 [... same *ERROR* line repeated for every subsequent allocation attempt, timestamps 12:07:20.849506 through 12:07:21.008541 ...]
00:30:53.966 [2024-05-14 12:07:21.008593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.008634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.008901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.008975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.009019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.009064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.009107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.009373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.009390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.011149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.011207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.011248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.011290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.966 [2024-05-14 12:07:21.011714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.011768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.011810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.011854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.011895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.012262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.012280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.014233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.014279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.014320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.014362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.014636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.014697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.966 [2024-05-14 12:07:21.014741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.014782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.014826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.015205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.015226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.016858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.016905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.016949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.017010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.017476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.017540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.017584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.017625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.966 [2024-05-14 12:07:21.017669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.018105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.018122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.020265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.020321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.020362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.020409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.020678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.966 [2024-05-14 12:07:21.020741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.020785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.020826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.020867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.021198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.967 [2024-05-14 12:07:21.021215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.022832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.022877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.022927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.022970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.023315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.023377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.023423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.023469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.023512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.023949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.023966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.025946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.967 [2024-05-14 12:07:21.025991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.026032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.026073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.026344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.026406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.026449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.026498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.026542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.026815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.026832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.028499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.028544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.028588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.967 [2024-05-14 12:07:21.028630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.029139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.029211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.029270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.029311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.029353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.029804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.029822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.033486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.033537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.033579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.033620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.033991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.967 [2024-05-14 12:07:21.034059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.034129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.034173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.034213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.034490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.034507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.038551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.038602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.038644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.038686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.039117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.039170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.039213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.967 [2024-05-14 12:07:21.039255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.039301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.039575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.039592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.043312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.043363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.043409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.043451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.043721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.043781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.043823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.043865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.043906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:53.967 [2024-05-14 12:07:21.044281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:53.967 [2024-05-14 12:07:21.044298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.047111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.047172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.047224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.047265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.047537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.047599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.047641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.047682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.047723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.048035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.048053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.229 [2024-05-14 12:07:21.053247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.053297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.053342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.053384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.053743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.053809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.053852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.053893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.053935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.054367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.054385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.058972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.059023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.229 [2024-05-14 12:07:21.059069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.059110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.059382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.059448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.059491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.059532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.059574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.059840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.059857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.063660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.063710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.063758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.063801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.229 [2024-05-14 12:07:21.064073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.064128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.064174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.064225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.064265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.064540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.064558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.069344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.069414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.069473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.069516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.069953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.229 [2024-05-14 12:07:21.070008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.229 [2024-05-14 12:07:21.070053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.232 [... same "Failed to get src_mbufs!" error repeated continuously through 2024-05-14 12:07:21.298787 ...] 
00:30:54.232 [2024-05-14 12:07:21.299060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.300637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.301706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.302097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.302491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.302891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.302909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.305573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.305967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.306360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.306764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.307154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.307572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.232 [2024-05-14 12:07:21.307962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.308349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.308748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.309156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.309173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.312011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.312415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.312806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.313193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.232 [2024-05-14 12:07:21.313579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.313976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.314369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.314768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.495 [2024-05-14 12:07:21.315157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.315601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.315620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.318247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.318649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.319039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.319433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.495 [2024-05-14 12:07:21.319803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.320206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.320600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.321001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.321395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.321840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.496 [2024-05-14 12:07:21.321858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.324755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.325151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.325564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.325976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.326434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.326837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.327231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.327629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.328026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.328375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.328392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.331151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.496 [2024-05-14 12:07:21.331555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.331945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.332332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.332773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.333176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.333574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.333966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.334354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.334783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.334801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.337396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.337793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.338183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.496 [2024-05-14 12:07:21.338589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.338968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.339370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.339797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.340188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.340588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.341098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.341115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.343829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.344235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.344640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.345032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.345427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.496 [2024-05-14 12:07:21.345828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.346221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.346627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.347022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.347450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.347469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.350073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.350470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.350860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.351248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.351607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.352010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.352407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.496 [2024-05-14 12:07:21.352797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.353185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.353631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.353650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.356334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.356729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.357125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.357521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.357987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.358389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.358791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.359181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.359579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.496 [2024-05-14 12:07:21.360083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.360101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.362974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.363382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.363781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.364166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.364599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.365004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.365407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.365809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.366198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.366585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.366604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.496 [2024-05-14 12:07:21.369251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.369654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.370041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.370441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.370782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.371187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.371582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.371971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.372365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.372728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.372746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.375388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.496 [2024-05-14 12:07:21.375788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.497 [2024-05-14 12:07:21.376182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.376584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.377020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.377425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.377822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.378217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.378617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.379061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.379089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.381641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.383273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.383672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.384062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.497 [2024-05-14 12:07:21.384444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.384843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.385235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.385636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.386027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.386457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.386478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.389918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.390315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.390364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.391981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.392256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.393815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.497 [2024-05-14 12:07:21.395368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.396111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.397420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.397695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.397712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.400083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.400483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.401689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.401738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.402051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.403625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.405180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.497 [2024-05-14 12:07:21.406056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.497 [2024-05-14 12:07:21.407643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.500 [previous *ERROR* message repeated through 2024-05-14 12:07:21.479853] 
00:30:54.500 [2024-05-14 12:07:21.479894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.479939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.479991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.480268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.480285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.482221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.482268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.482310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.482353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.482744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.482796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.482838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.482880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.500 [2024-05-14 12:07:21.482923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.483361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.483379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.484951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.484997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.485038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.485079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.485468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.485527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.485569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.485611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.485653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.485965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.500 [2024-05-14 12:07:21.485983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.487735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.487782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.488328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.488375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.488427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.488913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.490737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.490789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.490831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.490886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.493739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.498174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.500 [2024-05-14 12:07:21.498223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.501752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.501802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.501846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.511387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.513067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.513125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.513494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.513548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.500 [2024-05-14 12:07:21.513909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.514321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.514339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.517713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.501 [2024-05-14 12:07:21.518662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.520355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.521943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.523766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.524369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.524765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.525154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.525602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.525622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.528902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.529789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.531102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.532632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.501 [2024-05-14 12:07:21.534417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.534811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.535199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.535596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.536029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.536048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.538614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.540270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.541776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.543365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.544340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.544739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.545127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.501 [2024-05-14 12:07:21.545522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.545916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.545934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.548479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.549790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.551347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.552908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.553631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.554025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.554420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.554810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.555084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.555101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.501 [2024-05-14 12:07:21.558223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.559917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.561505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.562992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.563775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.564167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.564561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.565808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.566129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.566147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.569019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.570568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.572116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.501 [2024-05-14 12:07:21.572818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.573704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.574108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.574539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.575986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.576275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.501 [2024-05-14 12:07:21.576293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.579581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.581144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.582534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.582923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.583739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.584130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.764 [2024-05-14 12:07:21.585417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.586715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.586994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.587011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.590167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.591731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.592292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.592693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.593497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.594034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.595368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.596926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.597204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.764 [2024-05-14 12:07:21.597221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.600414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.601788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.602179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.602573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.603377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.604718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.606013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.607555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.607831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.607848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.610996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.611573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.764 [2024-05-14 12:07:21.611963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.612353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.613354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.614681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.616223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.617792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.618068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.618086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.621038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.621441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.621829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.622223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.623636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.764 [2024-05-14 12:07:21.625195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.626747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.627449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.627725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.627743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.629963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.630357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.631000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.632283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.634182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.635913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.636908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.764 [2024-05-14 12:07:21.638210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.764 [2024-05-14 12:07:21.638490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [identical error message repeated continuously through 2024-05-14 12:07:21.813615; duplicate lines omitted]
00:30:54.767 [2024-05-14 12:07:21.813661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.813704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.813745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.814055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.814099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.814147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.814190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.814469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.814486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.816144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.816189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.816231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.816272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.767 [2024-05-14 12:07:21.816740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.816785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.816827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.816869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.817176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.817197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.819211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.819256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.819297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.819339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.819651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.819695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.819737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.767 [2024-05-14 12:07:21.819778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.820045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.820061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.821709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.821776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.821820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.821865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.822176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.822220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.822262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.822304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.767 [2024-05-14 12:07:21.822740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.822757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.768 [2024-05-14 12:07:21.825017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.825061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.825111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.825153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.825473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.825517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.825565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.825609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.825879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.825896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.827561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.827606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.827647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.768 [2024-05-14 12:07:21.827688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.827996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.828040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.828081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.828122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.828498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.828516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.830500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.830552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.830597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.830639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.831141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.831186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.768 [2024-05-14 12:07:21.831231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.831273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.831646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.831663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.833184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.833230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.833272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.833319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.833627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.833671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.833713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.833761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.834029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.768 [2024-05-14 12:07:21.834046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.835909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.835955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.835996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.836039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.836462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.836506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.836549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.836590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.837028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.837046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.838662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.838706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.768 [2024-05-14 12:07:21.838747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.838788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.839176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.839220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.839266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.839307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.839578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.839596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.841210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.841254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.841295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.841336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.841670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.768 [2024-05-14 12:07:21.841714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.841756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.841797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.842225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.842245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.844247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.768 [2024-05-14 12:07:21.844297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.844337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.844382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.844761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.844806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.844847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.844888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:54.769 [2024-05-14 12:07:21.845155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.845172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.846848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.846903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.846945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.846986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.847296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.847339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.847380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:54.769 [2024-05-14 12:07:21.847427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.847774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.847791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.850080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.032 [2024-05-14 12:07:21.850126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.850179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.850222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.850529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.850579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.850640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.850684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.850959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.850976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.852644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.852692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.852734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.852774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.032 [2024-05-14 12:07:21.853077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.853121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.853162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.853203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.853476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.853493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.855679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.855740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.855784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.855829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.856141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.856185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.856227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.032 [2024-05-14 12:07:21.856268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.856724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.856743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.858389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.858438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.858479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.858520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.858880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.858924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.858966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.859015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.859286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.032 [2024-05-14 12:07:21.859303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.032 [2024-05-14 12:07:21.860960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [... previous message repeated for timestamps 12:07:21.861006 through 12:07:22.086210 ...]
00:30:55.035 [2024-05-14 12:07:22.086606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.087088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.087584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.088988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.089379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.090206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.090487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.090505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.093253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.093651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.094038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.094433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.094806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.035 [2024-05-14 12:07:22.095686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.096686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.097075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.098302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.098682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.098700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.102101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.102507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.102904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.104521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.104985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.105387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.106913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.035 [2024-05-14 12:07:22.107308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.107712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.108083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.108100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.110808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.111205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.111717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.113078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.113548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.035 [2024-05-14 12:07:22.113989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.298 [2024-05-14 12:07:22.115442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.298 [2024-05-14 12:07:22.115830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.298 [2024-05-14 12:07:22.116225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.298 [2024-05-14 12:07:22.116653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.298 [2024-05-14 12:07:22.116671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.298 [2024-05-14 12:07:22.120413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.298 [2024-05-14 12:07:22.121564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.298 [2024-05-14 12:07:22.121951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.298 [2024-05-14 12:07:22.123013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.123362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.123771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.124163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.124559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.124951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.125396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.125420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.299 [2024-05-14 12:07:22.128869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.129616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.130009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.131495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.131960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.132364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.132758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.133153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.133561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.133997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.134014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.137228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.138901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.299 [2024-05-14 12:07:22.139304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.139698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.140158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.140566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.140960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.141347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.141742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.142142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.142159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.144780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.146153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.146545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.146936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.299 [2024-05-14 12:07:22.147356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.147766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.148158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.148550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.148937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.149320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.149338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.152241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.152952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.154111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.155102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.155387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.155795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.299 [2024-05-14 12:07:22.156183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.156576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.156964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.157335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.157352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.160738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.161341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.161733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.162136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.162541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.163935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.164394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.164787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.299 [2024-05-14 12:07:22.166484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.166996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.167016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.172899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.173298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.174093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.175410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.175683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.177260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.178732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.179959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.181266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.181544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.299 [2024-05-14 12:07:22.181561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.183634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.185015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.185500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.185547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.185992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.187555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.299 [2024-05-14 12:07:22.188964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.190516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.192138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.192586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.192603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.197387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.300 [2024-05-14 12:07:22.198445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.199265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.199661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.199947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.201268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.202804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.204363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.205204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.205485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.205502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.208780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.209181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.209662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.300 [2024-05-14 12:07:22.209710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.209983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.210386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.210437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.211284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.211330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.211639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.211657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.216116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.216166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.216207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.216248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.216522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.300 [2024-05-14 12:07:22.217258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.217310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.218738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.218787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.219231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.219249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.221417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.221462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.221503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.221544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.221884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.221942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.300 [2024-05-14 12:07:22.221984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.300 [2024-05-14 12:07:22.222025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.303 [last message repeated for each subsequent allocation attempt through 2024-05-14 12:07:22.323928]
00:30:55.303 [2024-05-14 12:07:22.323997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.303 [2024-05-14 12:07:22.324041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.324083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.324125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.324428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.324445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.326007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.326051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.326096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.326137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.326530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.326593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.326636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.304 [2024-05-14 12:07:22.326678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.326719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.326994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.327011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.329567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.329617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.329659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.329700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.329974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.330035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.330078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.330119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.330160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.304 [2024-05-14 12:07:22.330433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.330450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.332112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.332168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.332210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.333520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.333863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.333919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.333962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.334003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.334044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.334485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.334502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.304 [2024-05-14 12:07:22.337007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.337909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.342546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.342606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.304 [2024-05-14 12:07:22.342651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.343039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.343455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.343513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.344980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.345024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.345416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.345785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.345801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.351546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.353104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.354075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.355688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.304 [2024-05-14 12:07:22.356123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.356188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.356583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.356633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.358193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.358645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.358663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.363827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.365392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.366956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.368098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.368387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.369276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.304 [2024-05-14 12:07:22.369669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.304 [2024-05-14 12:07:22.370964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.305 [2024-05-14 12:07:22.371539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.305 [2024-05-14 12:07:22.371986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.305 [2024-05-14 12:07:22.372009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.305 [2024-05-14 12:07:22.378335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.305 [2024-05-14 12:07:22.379973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.305 [2024-05-14 12:07:22.381495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.565 [2024-05-14 12:07:22.382530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.565 [2024-05-14 12:07:22.382863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.565 [2024-05-14 12:07:22.383269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.565 [2024-05-14 12:07:22.384231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.565 [2024-05-14 12:07:22.385142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.565 [2024-05-14 12:07:22.385538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.565 [2024-05-14 12:07:22.385841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.565 [2024-05-14 12:07:22.385858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.565 [2024-05-14 12:07:22.391208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.565 [2024-05-14 12:07:22.392777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.393353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.394761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.395255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.395659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.397150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.397545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.398395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.398747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.566 [2024-05-14 12:07:22.398764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.404538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.405650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.407105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.407534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.407972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.409356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.409850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.410244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.411824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.412100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.412117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.418293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.566 [2024-05-14 12:07:22.419237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.420176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.420568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.420886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.421962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.422354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.423651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.424964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.425240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.425258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.430544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.432027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.432423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.566 [2024-05-14 12:07:22.433174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.433456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.433863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.434652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.435951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.437506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.437781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.437798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.443700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.444158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.444551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.446247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.446787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.566 [2024-05-14 12:07:22.447193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.448781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.450472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.452078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.452354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.452371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.457408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.457807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.459109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.459679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.460116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.461113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.462668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.566 [2024-05-14 12:07:22.464232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.464933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.465209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.465226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.471624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.472024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.472945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.474251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.474531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.476088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.477436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.478763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.480077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.566 [2024-05-14 12:07:22.480351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.480369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.484712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.485109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.486762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.488414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.488694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.490269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.490988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.492281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.493840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.494117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.566 [2024-05-14 12:07:22.494134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.831 [2024-05-14 12:07:22.735817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.735858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.736181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.736198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.741077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.741130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.741171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.741216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.741657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.741713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.741757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.741799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.741842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.831 [2024-05-14 12:07:22.742145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.742162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.746038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.746090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.746132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.746173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.746575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.746637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.746679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.746720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.746761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.747065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.747082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.831 [2024-05-14 12:07:22.751154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.751208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.751250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.751294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.751692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.751751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.751793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.751834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.751875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.752216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.752234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.831 [2024-05-14 12:07:22.755843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.755895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.832 [2024-05-14 12:07:22.755942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.755984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.756258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.756318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.756381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.756430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.756472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.756745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.756762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.761115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.761165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.761210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.761252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.832 [2024-05-14 12:07:22.761701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.761763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.761808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.761851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.761893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.762321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.762338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.767199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.767252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.767293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.767334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.767610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.767678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.832 [2024-05-14 12:07:22.767720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.767761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.767802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.768081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.768098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.771128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.771182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.771226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.771267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.771634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.771691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.771733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.771774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.832 [2024-05-14 12:07:22.771816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.772128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.772145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.776949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.777002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.777043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.777101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.777375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.777436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.777486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.777529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.777572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.777887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.832 [2024-05-14 12:07:22.777904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.781788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.781839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.781884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.781934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.782210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.782267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.782316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.782364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.782416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.782690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.782707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.787543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.832 [2024-05-14 12:07:22.787595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.787638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.787690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.787963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.788029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.788080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.788128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.788174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.788630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.788649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.791816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.791867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.791908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.832 [2024-05-14 12:07:22.791949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.792220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.792280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.792323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.792365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.792412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.792773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.792791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.797630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.797683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.797725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.797768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.798207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.832 [2024-05-14 12:07:22.798260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.798304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.832 [2024-05-14 12:07:22.798347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.798391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.798676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.798692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.802257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.802308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.802360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.802412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.802750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.802807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.802849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.833 [2024-05-14 12:07:22.802889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.802931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.803239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.803256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.808752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.808807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.808849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.808890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.809165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.809227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.809270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.809312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.809353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.833 [2024-05-14 12:07:22.809806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.809824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.814426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.814488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.814538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.814583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.814853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.814912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.814958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.814999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.815040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.815308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:55.833 [2024-05-14 12:07:22.815325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:55.833 [2024-05-14 12:07:22.819145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.097 [... last message repeated through 2024-05-14 12:07:23.011887; duplicate lines omitted ...]
00:30:56.097 [2024-05-14 12:07:23.012380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.097 [2024-05-14 12:07:23.012404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.097 [2024-05-14 12:07:23.015780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.097 [2024-05-14 12:07:23.016190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.097 [2024-05-14 12:07:23.016593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.097 [2024-05-14 12:07:23.016990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.017415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.017817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.018207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.018615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.019013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.019476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.019496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.098 [2024-05-14 12:07:23.023003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.023406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.023800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.024195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.024579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.024982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.025368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.025763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.026156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.026510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.026533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.029951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.030353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.098 [2024-05-14 12:07:23.030749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.031141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.031529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.031937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.032330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.032725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.033113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.033562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.033580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.036999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.037410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.037810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.038201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.098 [2024-05-14 12:07:23.038664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.039066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.039475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.039871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.040262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.040714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.040733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.043551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.043947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.044333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.044733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.045163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.045574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.098 [2024-05-14 12:07:23.045967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.046358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.046761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.047189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.047206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.049923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.050320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.050722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.051116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.051562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.051962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.052347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.052743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.098 [2024-05-14 12:07:23.053137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.053540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.053558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.056335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.056741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.057130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.057541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.057995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.058397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.058798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.059202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.059598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.060095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.098 [2024-05-14 12:07:23.060113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.062550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.062944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.063330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.063724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.064135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.064559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.064966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.065355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.065747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.066210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.066229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.068880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.098 [2024-05-14 12:07:23.069277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.069676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.070073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.070493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.070897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.071286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.071678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.072074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.072436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.072455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.098 [2024-05-14 12:07:23.076386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.077826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.079384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.099 [2024-05-14 12:07:23.081005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.081427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.082862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.084448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.086005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.087435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.087836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.087853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.091423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.092986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.094556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.095413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.095690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.099 [2024-05-14 12:07:23.097005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.098567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.100121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.100543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.100996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.101015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.104766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.106332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.107718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.109069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.109421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.110996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.112558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.099 [2024-05-14 12:07:23.113443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.113836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.114283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.114301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.118038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.119605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.120393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.121696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.121971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.123565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.125049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.125447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.125839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.099 [2024-05-14 12:07:23.126256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.126274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.129664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.130494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.132089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.133796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.134073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.135650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.136053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.136449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.136839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.137275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.137295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.099 [2024-05-14 12:07:23.140342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.141659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.142976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.144531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.144807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.145762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.146165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.146557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.146945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.147384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.147406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.149655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.099 [2024-05-14 12:07:23.150961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.099 [2024-05-14 12:07:23.152521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [same error repeated through 2024-05-14 12:07:23.262327]
00:30:56.365 [2024-05-14 12:07:23.263958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.264972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.267166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.267211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.365 [2024-05-14 12:07:23.267252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.267299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.267576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.267636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.267678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.267726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.267770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.268041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.268058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.269743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.269787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.269828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.269869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.365 [2024-05-14 12:07:23.270138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.270196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.270239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.270280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.270321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.270707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.270724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.273190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.273234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.273276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.273316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.273625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.273685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.365 [2024-05-14 12:07:23.273727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.273772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.273813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.274080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.274097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.275755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.275800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.275845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.275900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.276171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.276223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.276273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.276317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.365 [2024-05-14 12:07:23.276359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.276629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.276646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.365 [2024-05-14 12:07:23.279882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.366 [2024-05-14 12:07:23.279898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.281634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.281679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.281721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.281761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.282033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.282092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.282135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.282177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.282218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.282491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.282509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.284804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.366 [2024-05-14 12:07:23.284850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.284895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.284938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.285331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.285387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.285434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.285475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.285515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.285818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.285835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.287464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.287507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.287553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.366 [2024-05-14 12:07:23.287602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.287871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.287926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.287968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.288018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.288064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.288333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.288350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.290459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.290504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.290549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.290591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.291027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.366 [2024-05-14 12:07:23.291081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.291124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.291166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.291209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.291484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.291501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.293133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.293177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.293219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.294511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.294782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.294843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.294884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.366 [2024-05-14 12:07:23.294925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.294966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.295235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.295251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.297477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.297523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.297565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.297607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.298054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.298108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.298162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.298207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.298252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.366 [2024-05-14 12:07:23.298527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.298548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.300222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.300267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.300308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.301849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.302173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.302237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.302637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.302683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.303067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.303554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.303573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.366 [2024-05-14 12:07:23.306803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.307532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.308823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.310373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.310651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.310717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.312058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.312104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.312497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.312935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.312953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.316457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.318017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.366 [2024-05-14 12:07:23.318756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.320171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.320449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.321995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.366 [2024-05-14 12:07:23.323569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.367 [2024-05-14 12:07:23.323966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.367 [2024-05-14 12:07:23.324361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.367 [2024-05-14 12:07:23.324741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.367 [2024-05-14 12:07:23.324760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.367 [2024-05-14 12:07:23.328159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.367 [2024-05-14 12:07:23.329587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.368 [2024-05-14 12:07:23.330877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.368 [2024-05-14 12:07:23.332191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.368 [2024-05-14 12:07:23.332468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... identical *ERROR* line repeated for timestamps 2024-05-14 12:07:23.332468 through 12:07:23.561334 (log prefixes 00:30:56.368 to 00:30:56.631); repeats elided ...] 
00:30:56.631 [2024-05-14 12:07:23.562368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.563674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.563947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.565509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.566769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.567160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.567558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.568021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.568038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.571359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.572097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.573592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.575271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.631 [2024-05-14 12:07:23.575550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.577118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.577520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.577913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.578303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.578768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.578787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.582148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.583160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.584472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.631 [2024-05-14 12:07:23.584519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.584792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.586371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.632 [2024-05-14 12:07:23.587416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.587806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.588194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.588654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.588673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.591898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.592621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.593960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.595514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.595787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.597464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.597864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.598255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.632 [2024-05-14 12:07:23.598650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.599087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.599105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.602190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.603429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.604729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.604776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.605048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.606615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.606663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.607359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.607409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.607871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.632 [2024-05-14 12:07:23.607889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.609982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.610033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.610074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.610120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.610392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.611959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.612007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.613242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.613288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.613567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.613585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.615232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.632 [2024-05-14 12:07:23.615292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.615346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.615389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.615843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.615896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.615940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.615982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.616024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.616406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.616425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.618436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.618481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.618522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.632 [2024-05-14 12:07:23.618563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.618827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.618885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.618927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.618969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.619010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.619413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.619431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.620976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.621025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.621065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.621114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.621503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.632 [2024-05-14 12:07:23.621555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.621597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.621638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.621680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.622111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.632 [2024-05-14 12:07:23.622129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.633 [2024-05-14 12:07:23.624618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.624952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.626645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.626690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.626732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.626773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.627080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.627138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.627182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.627224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.627287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.633 [2024-05-14 12:07:23.627803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.627824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.629942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.629988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.630033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.630074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.630344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.630420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.630465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.630506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.630547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.630817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.630834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.633 [2024-05-14 12:07:23.632521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.632566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.632611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.632651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.632921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.632979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.633021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.633062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.633111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.633489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.633507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.635715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.635759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.633 [2024-05-14 12:07:23.635800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.635841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.636177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.636239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.636281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.636326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.636368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.636638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.636656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.638329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.638373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.638431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.638473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.633 [2024-05-14 12:07:23.638741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.638812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.638857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.638899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.638939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.639218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.639236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.641527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.641573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.641615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.641658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.641927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.633 [2024-05-14 12:07:23.641980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.633 [2024-05-14 12:07:23.642021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.896 [... previous message repeated verbatim ~270 times between 12:07:23.642021 and 12:07:23.732422; duplicates collapsed ...] 
00:30:56.896 [2024-05-14 12:07:23.732857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.733257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.733669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.734065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.734460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.734935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.734953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.737640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.738032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.738427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.738828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.739274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.739684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.896 [2024-05-14 12:07:23.740075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.740472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.740864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.741333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.741350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.744158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.744565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.744963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.745355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.745733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.746133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.746538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.746934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.896 [2024-05-14 12:07:23.747326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.747777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.747796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.750375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.750783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.751177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.751574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.752007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.752419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.752814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.753204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.753594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.754076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.896 [2024-05-14 12:07:23.754094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.756770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.757168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.757569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.757963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.758394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.758799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.759186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.759580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.759972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.760363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.760380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.763050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.896 [2024-05-14 12:07:23.763455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.763848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.896 [2024-05-14 12:07:23.764236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.764724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.765123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.765517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.765914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.766314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.766761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.766779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.769515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.769907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.770296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.897 [2024-05-14 12:07:23.770694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.771112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.771516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.771913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.772305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.772701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.773156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.773173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.775871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.776269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.776669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.777062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.777516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.897 [2024-05-14 12:07:23.777918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.778307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.778702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.779099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.779499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.779517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.782211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.782618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.783012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.783406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.783836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.784239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.784640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.897 [2024-05-14 12:07:23.785031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.785427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.785889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.785906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.788615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.789019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.789416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.791078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.791478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.793055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.793456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.793849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.794242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.897 [2024-05-14 12:07:23.794665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.794683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.797362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.797765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.798158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.798558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.798947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.799347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.799741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.800130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.800524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.800897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.800914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.897 [2024-05-14 12:07:23.803674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.804070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.804472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.804867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.805295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.805703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.806097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.806495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.806902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.807347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.807367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.810875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.812440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.897 [2024-05-14 12:07:23.813180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.814484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.814761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.816418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.818157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.818555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.818951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.819316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.819334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.822768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.823964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.825470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.826818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.897 [2024-05-14 12:07:23.827094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.828672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.829411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.829807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.830198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.830730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.897 [2024-05-14 12:07:23.830749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.898 [2024-05-14 12:07:23.833958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.898 [2024-05-14 12:07:23.834672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.898 [2024-05-14 12:07:23.836021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.898 [2024-05-14 12:07:23.837588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.898 [2024-05-14 12:07:23.837864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:30:56.898 [2024-05-14 12:07:23.839267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:30:56.898 [2024-05-14 12:07:23.839664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:56.898 [2024-05-14 12:07:23.840055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:56.898 [2024-05-14 12:07:23.840450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:56.898 [2024-05-14 12:07:23.840910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:56.898 [2024-05-14 12:07:23.840928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:30:57.465
00:30:57.465 Latency(us)
00:30:57.465 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:57.465 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:30:57.465 Verification LBA range: start 0x0 length 0x100
00:30:57.465 crypto_ram : 6.00 42.67 2.67 0.00 0.00 2907565.19 300895.72 2494699.07
00:30:57.465 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:30:57.465 Verification LBA range: start 0x100 length 0x100
00:30:57.465 crypto_ram : 6.07 42.17 2.64 0.00 0.00 2949953.45 282659.62 2655176.79
00:30:57.465 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:30:57.465 Verification LBA range: start 0x0 length 0x100
00:30:57.465 crypto_ram1 : 6.00 42.66 2.67 0.00 0.00 2805143.82 300895.72 2290454.71
00:30:57.465 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:30:57.465 Verification LBA range: start 0x100 length 0x100
00:30:57.465 crypto_ram1 : 6.07 42.16 2.63 0.00 0.00 2846064.64 280836.01 2450932.42
00:30:57.465 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:30:57.465 Verification LBA range: start 0x0 length 0x100
00:30:57.465 crypto_ram2 : 5.57 277.90 17.37 0.00 0.00 411265.79 8605.16 667441.42
00:30:57.465 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:30:57.465 Verification LBA range: start 0x100 length 0x100
00:30:57.465 crypto_ram2 : 5.60 264.54 16.53 0.00 0.00 431480.04 49693.38 685677.52
00:30:57.465 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:30:57.465 Verification LBA range: start 0x0 length 0x100
00:30:57.465 crypto_ram3 : 5.70 291.51 18.22 0.00 0.00 381109.22 67017.68 477785.93
00:30:57.465 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:30:57.465 Verification LBA range: start 0x100 length 0x100
00:30:57.465 crypto_ram3 : 5.73 275.13 17.20 0.00 0.00 401555.74 24504.77 373840.14
00:30:57.465 ===================================================================================================================
00:30:57.465 Total : 1278.73 79.92 0.00 0.00 752630.41 8605.16 2655176.79
00:30:58.034
00:30:58.034 real 0m9.246s
00:30:58.034 user 0m17.525s
00:30:58.034 sys 0m0.475s
00:30:58.034 12:07:24 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable
00:30:58.034 12:07:24 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:30:58.034 ************************************
00:30:58.034 END TEST bdev_verify_big_io
00:30:58.034 ************************************
00:30:58.034 12:07:24 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:30:58.034 12:07:24 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:30:58.034 12:07:24 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable
00:30:58.034 12:07:24 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:30:58.034
************************************
00:30:58.034 START TEST bdev_write_zeroes
************************************
12:07:24 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
[2024-05-14 12:07:25.015407] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
[2024-05-14 12:07:25.015465] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1850101 ]
00:30:58.293 [2024-05-14 12:07:25.145539] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:30:58.293 [2024-05-14 12:07:25.245849] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:30:58.293 [2024-05-14 12:07:25.267141] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:30:58.293 [2024-05-14 12:07:25.275163] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:30:58.293 [2024-05-14 12:07:25.283180] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:30:58.552 [2024-05-14 12:07:25.396331] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:31:01.085 [2024-05-14 12:07:27.606474] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:31:01.085 [2024-05-14 12:07:27.606547] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:31:01.085 [2024-05-14 12:07:27.606562] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:01.085 [2024-05-14 12:07:27.614494] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:31:01.085 [2024-05-14 12:07:27.614513] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:31:01.085 [2024-05-14 12:07:27.614525] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:01.085 [2024-05-14 12:07:27.622515] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:31:01.085 [2024-05-14 12:07:27.622532] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:31:01.085 [2024-05-14 12:07:27.622544] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:01.085 [2024-05-14 12:07:27.630542] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:31:01.085 [2024-05-14 12:07:27.630561] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:31:01.085 [2024-05-14 12:07:27.630573] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:01.085 Running I/O for 1 seconds...
00:31:02.023 00:31:02.023 Latency(us) 00:31:02.023 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:02.023 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:02.023 crypto_ram : 1.02 2023.15 7.90 0.00 0.00 62717.21 5584.81 76135.74 00:31:02.023 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:02.023 crypto_ram1 : 1.03 2036.26 7.95 0.00 0.00 62017.23 5584.81 70664.90 00:31:02.023 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:02.023 crypto_ram2 : 1.02 15580.37 60.86 0.00 0.00 8087.43 2436.23 10656.72 00:31:02.023 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:02.023 crypto_ram3 : 1.02 15612.33 60.99 0.00 0.00 8044.88 2421.98 8548.17 00:31:02.023 =================================================================================================================== 00:31:02.023 Total : 35252.11 137.70 0.00 0.00 14346.33 2421.98 76135.74 00:31:02.282 00:31:02.282 real 0m4.218s 00:31:02.282 user 0m3.796s 00:31:02.282 sys 0m0.380s 00:31:02.282 12:07:29 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:02.282 12:07:29 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:31:02.282 ************************************ 00:31:02.282 END TEST bdev_write_zeroes 00:31:02.282 ************************************ 00:31:02.282 12:07:29 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:02.282 12:07:29 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:31:02.282 12:07:29 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:31:02.282 12:07:29 
blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:02.282 ************************************ 00:31:02.282 START TEST bdev_json_nonenclosed 00:31:02.282 ************************************ 00:31:02.282 12:07:29 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:02.282 [2024-05-14 12:07:29.332630] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:31:02.283 [2024-05-14 12:07:29.332696] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1850649 ] 00:31:02.542 [2024-05-14 12:07:29.465643] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:02.542 [2024-05-14 12:07:29.570017] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:02.542 [2024-05-14 12:07:29.570094] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:31:02.542 [2024-05-14 12:07:29.570114] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:02.542 [2024-05-14 12:07:29.570127] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:02.801 00:31:02.801 real 0m0.407s 00:31:02.801 user 0m0.233s 00:31:02.801 sys 0m0.170s 00:31:02.801 12:07:29 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:02.801 12:07:29 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:31:02.801 ************************************ 00:31:02.801 END TEST bdev_json_nonenclosed 00:31:02.801 ************************************ 00:31:02.801 12:07:29 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:02.801 12:07:29 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:31:02.801 12:07:29 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:31:02.801 12:07:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:02.802 ************************************ 00:31:02.802 START TEST bdev_json_nonarray 00:31:02.802 ************************************ 00:31:02.802 12:07:29 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:02.802 [2024-05-14 12:07:29.823246] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:31:02.802 [2024-05-14 12:07:29.823306] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1850692 ] 00:31:03.060 [2024-05-14 12:07:29.952262] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:03.060 [2024-05-14 12:07:30.063261] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:03.060 [2024-05-14 12:07:30.063337] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:31:03.060 [2024-05-14 12:07:30.063358] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:03.060 [2024-05-14 12:07:30.063370] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:03.320 00:31:03.320 real 0m0.406s 00:31:03.320 user 0m0.253s 00:31:03.320 sys 0m0.149s 00:31:03.320 12:07:30 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:03.320 12:07:30 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:31:03.320 ************************************ 00:31:03.320 END TEST bdev_json_nonarray 00:31:03.320 ************************************ 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:31:03.320 12:07:30 blockdev_crypto_qat -- 
bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:31:03.320 12:07:30 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:31:03.320 00:31:03.320 real 1m12.178s 00:31:03.320 user 2m40.227s 00:31:03.320 sys 0m9.079s 00:31:03.320 12:07:30 blockdev_crypto_qat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:03.320 12:07:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:03.320 ************************************ 00:31:03.320 END TEST blockdev_crypto_qat 00:31:03.320 ************************************ 00:31:03.320 12:07:30 -- spdk/autotest.sh@356 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:31:03.320 12:07:30 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:31:03.320 12:07:30 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:31:03.320 12:07:30 -- common/autotest_common.sh@10 -- # set +x 00:31:03.320 ************************************ 00:31:03.320 START TEST chaining 00:31:03.320 ************************************ 00:31:03.320 12:07:30 chaining -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:31:03.580 * Looking for test storage... 
00:31:03.580 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:03.580 12:07:30 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@7 -- # uname -s 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:03.580 12:07:30 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:03.580 12:07:30 chaining -- scripts/common.sh@516 -- # 
[[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:03.580 12:07:30 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:03.580 12:07:30 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:03.580 12:07:30 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:03.580 12:07:30 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:03.580 12:07:30 chaining -- paths/export.sh@5 -- # export PATH 00:31:03.580 12:07:30 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@47 -- # : 0 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:03.580 12:07:30 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:31:03.580 12:07:30 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:31:03.580 12:07:30 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:31:03.580 12:07:30 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:31:03.580 12:07:30 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:31:03.580 12:07:30 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:03.580 12:07:30 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:03.580 12:07:30 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:31:03.580 12:07:30 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:31:03.580 12:07:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@296 -- # e810=() 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@297 -- # x722=() 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@298 -- # mlx=() 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@336 -- # return 1 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:31:11.702 WARNING: No supported devices were found, fallback requested for tcp test 00:31:11.702 12:07:38 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:31:11.702 Cannot find device "nvmf_tgt_br" 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@155 -- # true 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:31:11.702 Cannot find device "nvmf_tgt_br2" 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@156 -- # true 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:31:11.702 Cannot find device "nvmf_tgt_br" 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@158 -- # 
true 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:31:11.702 Cannot find device "nvmf_tgt_br2" 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@159 -- # true 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:31:11.702 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@162 -- # true 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:31:11.702 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@163 -- # true 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:31:11.702 12:07:38 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:31:11.962 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:31:11.962 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.103 ms 00:31:11.962 00:31:11.962 --- 10.0.0.2 ping statistics --- 00:31:11.962 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:11.962 rtt min/avg/max/mdev = 0.103/0.103/0.103/0.000 ms 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:31:11.962 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:31:11.962 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.066 ms 00:31:11.962 00:31:11.962 --- 10.0.0.3 ping statistics --- 00:31:11.962 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:11.962 rtt min/avg/max/mdev = 0.066/0.066/0.066/0.000 ms 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:31:11.962 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:31:11.962 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.058 ms 00:31:11.962 00:31:11.962 --- 10.0.0.1 ping statistics --- 00:31:11.962 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:31:11.962 rtt min/avg/max/mdev = 0.058/0.058/0.058/0.000 ms 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@433 -- # return 0 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:31:11.962 12:07:38 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:31:11.962 12:07:38 chaining -- common/autotest_common.sh@720 -- # xtrace_disable 00:31:11.962 12:07:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@481 -- # nvmfpid=1854524 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 
-e 0xFFFF -m 0x2 00:31:11.962 12:07:38 chaining -- nvmf/common.sh@482 -- # waitforlisten 1854524 00:31:11.962 12:07:38 chaining -- common/autotest_common.sh@827 -- # '[' -z 1854524 ']' 00:31:11.962 12:07:38 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:11.962 12:07:38 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:31:11.962 12:07:38 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:11.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:11.962 12:07:38 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:31:11.962 12:07:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:11.962 [2024-05-14 12:07:38.957296] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:31:11.962 [2024-05-14 12:07:38.957383] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:12.221 [2024-05-14 12:07:39.104562] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:12.221 [2024-05-14 12:07:39.234616] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:31:12.221 [2024-05-14 12:07:39.234674] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:31:12.221 [2024-05-14 12:07:39.234694] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:31:12.221 [2024-05-14 12:07:39.234711] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:31:12.221 [2024-05-14 12:07:39.234726] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:31:12.221 [2024-05-14 12:07:39.234769] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:31:12.790 12:07:39 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:31:12.790 12:07:39 chaining -- common/autotest_common.sh@860 -- # return 0 00:31:12.790 12:07:39 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:31:12.790 12:07:39 chaining -- common/autotest_common.sh@726 -- # xtrace_disable 00:31:12.790 12:07:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:12.790 12:07:39 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:31:12.790 12:07:39 chaining -- bdev/chaining.sh@69 -- # mktemp 00:31:12.790 12:07:39 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.hzrMB2HB0j 00:31:12.790 12:07:39 chaining -- bdev/chaining.sh@69 -- # mktemp 00:31:12.790 12:07:39 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.SsT7LUrtHa 00:31:12.790 12:07:39 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:31:12.790 12:07:39 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:31:12.790 12:07:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:12.790 12:07:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:12.790 malloc0 00:31:12.790 true 00:31:12.790 true 00:31:12.790 [2024-05-14 12:07:39.855051] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:12.790 crypto0 00:31:12.790 [2024-05-14 12:07:39.863066] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:31:12.790 crypto1 00:31:12.790 [2024-05-14 12:07:39.871185] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:31:13.050 [2024-05-14 12:07:39.887162] nvmf_rpc.c: 610:decode_rpc_listen_address: *WARNING*: decode_rpc_listen_address: deprecated feature [listen_]address.transport is deprecated in favor of trtype to be removed in v24.09 00:31:13.050 [2024-05-14 12:07:39.887548] tcp.c: 
967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:31:13.050 12:07:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@85 -- # update_stats 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:13.050 12:07:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:13.050 12:07:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:13.050 12:07:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:13.050 12:07:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:13.050 12:07:39 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:31:13.050 12:07:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:31:13.050 12:07:39 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:13.050 12:07:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:13.050 12:07:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:13.050 12:07:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:13.050 12:07:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:13.050 12:07:40 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:31:13.050 12:07:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.hzrMB2HB0j bs=1K count=64 00:31:13.050 64+0 records in 00:31:13.050 64+0 records out 00:31:13.050 65536 bytes (66 kB, 64 KiB) copied, 0.000716005 s, 91.5 MB/s 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.hzrMB2HB0j --ob Nvme0n1 --bs 65536 --count 1 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@25 -- # local config 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:13.050 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:13.050 "subsystems": [ 00:31:13.050 { 00:31:13.050 "subsystem": "bdev", 00:31:13.050 "config": [ 00:31:13.050 { 00:31:13.050 "method": "bdev_nvme_attach_controller", 00:31:13.050 "params": { 00:31:13.050 "trtype": "tcp", 00:31:13.050 "adrfam": "IPv4", 00:31:13.050 "name": "Nvme0", 00:31:13.050 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:13.050 "traddr": "10.0.0.2", 00:31:13.050 "trsvcid": "4420" 00:31:13.050 } 00:31:13.050 }, 00:31:13.050 { 00:31:13.050 "method": "bdev_set_options", 00:31:13.050 "params": { 00:31:13.050 "bdev_auto_examine": false 00:31:13.050 } 00:31:13.050 } 00:31:13.050 ] 00:31:13.050 } 00:31:13.050 ] 00:31:13.050 }' 00:31:13.050 12:07:40 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:13.050 "subsystems": [ 00:31:13.050 { 00:31:13.050 "subsystem": "bdev", 00:31:13.050 "config": [ 00:31:13.050 { 00:31:13.050 
"method": "bdev_nvme_attach_controller", 00:31:13.050 "params": { 00:31:13.050 "trtype": "tcp", 00:31:13.050 "adrfam": "IPv4", 00:31:13.050 "name": "Nvme0", 00:31:13.050 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:13.050 "traddr": "10.0.0.2", 00:31:13.050 "trsvcid": "4420" 00:31:13.050 } 00:31:13.050 }, 00:31:13.050 { 00:31:13.050 "method": "bdev_set_options", 00:31:13.050 "params": { 00:31:13.050 "bdev_auto_examine": false 00:31:13.050 } 00:31:13.050 } 00:31:13.051 ] 00:31:13.051 } 00:31:13.051 ] 00:31:13.051 }' 00:31:13.051 12:07:40 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.hzrMB2HB0j --ob Nvme0n1 --bs 65536 --count 1 00:31:13.310 [2024-05-14 12:07:40.188959] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:31:13.310 [2024-05-14 12:07:40.189027] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1854741 ] 00:31:13.310 [2024-05-14 12:07:40.317945] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:13.572 [2024-05-14 12:07:40.417712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:13.832  Copying: 64/64 [kB] (average 20 MBps) 00:31:13.832 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:13.832 12:07:40 chaining -- common/autotest_common.sh@559 -- # 
xtrace_disable 00:31:13.832 12:07:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:13.832 12:07:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:13.832 12:07:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:13.832 12:07:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:13.832 12:07:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:13.832 12:07:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "decrypt").executed' 00:31:14.092 12:07:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:14.092 12:07:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:14.092 12:07:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:14.092 12:07:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:14.092 12:07:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:14.092 12:07:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:14.092 12:07:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@96 -- # update_stats 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:14.092 12:07:41 
chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@44 -- # jq -r 
'.operations[] | select(.opcode == "decrypt").executed' 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:14.092 12:07:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:14.092 12:07:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:14.351 12:07:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:14.351 12:07:41 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:14.351 12:07:41 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.SsT7LUrtHa --ib Nvme0n1 --bs 65536 --count 1 00:31:14.351 12:07:41 chaining -- bdev/chaining.sh@25 -- # local config 00:31:14.351 12:07:41 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:14.351 12:07:41 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:14.351 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:14.351 
12:07:41 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:14.351 "subsystems": [ 00:31:14.351 { 00:31:14.351 "subsystem": "bdev", 00:31:14.351 "config": [ 00:31:14.351 { 00:31:14.351 "method": "bdev_nvme_attach_controller", 00:31:14.351 "params": { 00:31:14.351 "trtype": "tcp", 00:31:14.351 "adrfam": "IPv4", 00:31:14.351 "name": "Nvme0", 00:31:14.351 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:14.351 "traddr": "10.0.0.2", 00:31:14.351 "trsvcid": "4420" 00:31:14.351 } 00:31:14.351 }, 00:31:14.351 { 00:31:14.351 "method": "bdev_set_options", 00:31:14.351 "params": { 00:31:14.351 "bdev_auto_examine": false 00:31:14.351 } 00:31:14.351 } 00:31:14.351 ] 00:31:14.351 } 00:31:14.351 ] 00:31:14.351 }' 00:31:14.351 12:07:41 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.SsT7LUrtHa --ib Nvme0n1 --bs 65536 --count 1 00:31:14.351 12:07:41 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:14.351 "subsystems": [ 00:31:14.351 { 00:31:14.351 "subsystem": "bdev", 00:31:14.351 "config": [ 00:31:14.351 { 00:31:14.351 "method": "bdev_nvme_attach_controller", 00:31:14.351 "params": { 00:31:14.351 "trtype": "tcp", 00:31:14.351 "adrfam": "IPv4", 00:31:14.351 "name": "Nvme0", 00:31:14.351 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:14.351 "traddr": "10.0.0.2", 00:31:14.351 "trsvcid": "4420" 00:31:14.351 } 00:31:14.351 }, 00:31:14.351 { 00:31:14.351 "method": "bdev_set_options", 00:31:14.351 "params": { 00:31:14.351 "bdev_auto_examine": false 00:31:14.351 } 00:31:14.351 } 00:31:14.351 ] 00:31:14.351 } 00:31:14.351 ] 00:31:14.351 }' 00:31:14.351 [2024-05-14 12:07:41.308616] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
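The `update_stats` / `(( new == stats[key] + delta ))` pairs that recur through this trace follow one pattern: snapshot the accel counters into a bash associative array, run I/O, then assert exact increments. A pure-bash sketch of that pattern (`get_stat` is a stub here; the real script pipes `rpc_cmd accel_get_stats` through jq):

```shell
#!/usr/bin/env bash
# Minimal sketch of the snapshot-then-delta pattern: stats[] holds the
# baseline, and the check after the I/O asserts the expected increment.
declare -A stats
counter=12                       # mocked live counter value

get_stat() { echo "$counter"; }  # stub for the rpc_cmd | jq pipeline

update_stats() { stats["sequence_executed"]=$(get_stat); }

update_stats                     # baseline, as at chaining.sh@85
counter=13                       # one 64 KiB write drove one sequence
(( $(get_stat) == stats[sequence_executed] + 1 )) && echo "delta OK"
```

Taking a fresh baseline after every check is what lets each subsequent assertion state an absolute delta (`+ 1`, `+ 2`, `+ 16`) instead of an accumulated total.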
00:31:14.351 [2024-05-14 12:07:41.308683] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1854948 ] 00:31:14.610 [2024-05-14 12:07:41.438585] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:14.610 [2024-05-14 12:07:41.537532] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:14.869  Copying: 64/64 [kB] (average 20 MBps) 00:31:14.869 00:31:14.869 12:07:41 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:31:14.869 12:07:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:14.869 12:07:41 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:14.869 12:07:41 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:14.869 12:07:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:14.869 12:07:41 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:14.869 12:07:41 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:14.869 12:07:41 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:14.869 12:07:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:14.869 12:07:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:15.128 12:07:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:15.128 12:07:41 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:31:15.128 12:07:41 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:31:15.128 12:07:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:15.128 12:07:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:15.128 12:07:41 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:15.128 12:07:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:15.128 12:07:41 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:15.128 
12:07:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:15.129 12:07:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:15.129 12:07:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:15.129 12:07:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:15.129 12:07:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:15.129 12:07:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:15.129 12:07:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:15.129 12:07:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
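The `cmp` of the two temp files that follows verifies the full round trip: random data written through the crypto stack at `@89` must read back identically at `@99`. A file-backed sketch of the same check, with plain `dd` over temp files standing in for `spdk_dd` against the Nvme0n1 bdev:

```shell
#!/usr/bin/env bash
# Round-trip sketch: random 64 KiB in, "write" it out, "read" it back,
# byte-compare. Temp files stand in for the remote bdev.
set -e
input=$(mktemp) backing=$(mktemp) output=$(mktemp)
dd if=/dev/urandom of="$input" bs=1K count=64 status=none
dd if="$input"   of="$backing" bs=65536 count=1 status=none  # write path
dd if="$backing" of="$output"  bs=65536 count=1 status=none  # read path
result=$(cmp "$input" "$output" && echo "round-trip OK")
echo "$result"
rm -f "$input" "$backing" "$output"
```

In the real test the write path encrypts through crypto0/crypto1 and the read path decrypts, so a byte-identical `cmp` proves encrypt and decrypt are inverses for this key pair.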
00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:15.129 12:07:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:15.129 12:07:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:15.129 12:07:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.hzrMB2HB0j /tmp/tmp.SsT7LUrtHa 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@25 -- # local config 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:15.129 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:15.129 "subsystems": [ 00:31:15.129 { 00:31:15.129 "subsystem": "bdev", 00:31:15.129 "config": [ 00:31:15.129 { 00:31:15.129 "method": "bdev_nvme_attach_controller", 00:31:15.129 "params": { 00:31:15.129 "trtype": "tcp", 00:31:15.129 "adrfam": "IPv4", 00:31:15.129 "name": "Nvme0", 00:31:15.129 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:15.129 "traddr": "10.0.0.2", 00:31:15.129 "trsvcid": "4420" 00:31:15.129 } 00:31:15.129 }, 00:31:15.129 { 00:31:15.129 "method": "bdev_set_options", 00:31:15.129 "params": { 00:31:15.129 "bdev_auto_examine": false 00:31:15.129 } 00:31:15.129 } 00:31:15.129 ] 00:31:15.129 } 00:31:15.129 ] 00:31:15.129 }' 00:31:15.129 12:07:42 chaining 
-- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:31:15.129 12:07:42 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:15.129 "subsystems": [ 00:31:15.129 { 00:31:15.129 "subsystem": "bdev", 00:31:15.129 "config": [ 00:31:15.129 { 00:31:15.129 "method": "bdev_nvme_attach_controller", 00:31:15.129 "params": { 00:31:15.129 "trtype": "tcp", 00:31:15.129 "adrfam": "IPv4", 00:31:15.129 "name": "Nvme0", 00:31:15.129 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:15.129 "traddr": "10.0.0.2", 00:31:15.129 "trsvcid": "4420" 00:31:15.129 } 00:31:15.129 }, 00:31:15.129 { 00:31:15.129 "method": "bdev_set_options", 00:31:15.129 "params": { 00:31:15.129 "bdev_auto_examine": false 00:31:15.129 } 00:31:15.129 } 00:31:15.129 ] 00:31:15.129 } 00:31:15.129 ] 00:31:15.129 }' 00:31:15.129 [2024-05-14 12:07:42.200646] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:31:15.129 [2024-05-14 12:07:42.200698] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1854988 ] 00:31:15.388 [2024-05-14 12:07:42.319105] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:15.388 [2024-05-14 12:07:42.434582] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:15.906  Copying: 64/64 [kB] (average 12 MBps) 00:31:15.906 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@106 -- # update_stats 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # 
rpc=rpc_cmd 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:15.906 12:07:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:15.906 12:07:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:15.906 12:07:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:15.906 12:07:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:15.906 12:07:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:15.906 12:07:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:15.906 12:07:42 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:15.906 12:07:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:15.906 12:07:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:15.906 12:07:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:15.906 12:07:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:15.907 12:07:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:15.907 12:07:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:16.165 12:07:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:16.166 12:07:43 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:16.166 12:07:43 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.hzrMB2HB0j --ob Nvme0n1 --bs 4096 --count 16 00:31:16.166 12:07:43 chaining -- bdev/chaining.sh@25 -- # local config 00:31:16.166 12:07:43 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:16.166 12:07:43 chaining 
-- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:16.166 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:16.166 12:07:43 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:16.166 "subsystems": [ 00:31:16.166 { 00:31:16.166 "subsystem": "bdev", 00:31:16.166 "config": [ 00:31:16.166 { 00:31:16.166 "method": "bdev_nvme_attach_controller", 00:31:16.166 "params": { 00:31:16.166 "trtype": "tcp", 00:31:16.166 "adrfam": "IPv4", 00:31:16.166 "name": "Nvme0", 00:31:16.166 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:16.166 "traddr": "10.0.0.2", 00:31:16.166 "trsvcid": "4420" 00:31:16.166 } 00:31:16.166 }, 00:31:16.166 { 00:31:16.166 "method": "bdev_set_options", 00:31:16.166 "params": { 00:31:16.166 "bdev_auto_examine": false 00:31:16.166 } 00:31:16.166 } 00:31:16.166 ] 00:31:16.166 } 00:31:16.166 ] 00:31:16.166 }' 00:31:16.166 12:07:43 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.hzrMB2HB0j --ob Nvme0n1 --bs 4096 --count 16 00:31:16.166 12:07:43 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:31:16.166 "subsystems": [ 00:31:16.166 { 00:31:16.166 "subsystem": "bdev", 00:31:16.166 "config": [ 00:31:16.166 { 00:31:16.166 "method": "bdev_nvme_attach_controller", 00:31:16.166 "params": { 00:31:16.166 "trtype": "tcp", 00:31:16.166 "adrfam": "IPv4", 00:31:16.166 "name": "Nvme0", 00:31:16.166 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:16.166 "traddr": "10.0.0.2", 00:31:16.166 "trsvcid": "4420" 00:31:16.166 } 00:31:16.166 }, 00:31:16.166 { 00:31:16.166 "method": "bdev_set_options", 00:31:16.166 "params": { 00:31:16.166 "bdev_auto_examine": false 00:31:16.166 } 00:31:16.166 } 00:31:16.166 ] 00:31:16.166 } 00:31:16.166 ] 00:31:16.166 }' 00:31:16.166 [2024-05-14 12:07:43.118617] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:31:16.166 [2024-05-14 12:07:43.118685] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1855181 ] 00:31:16.166 [2024-05-14 12:07:43.249329] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:16.425 [2024-05-14 12:07:43.348974] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:16.944  Copying: 64/64 [kB] (average 10 MBps) 00:31:16.944 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:16.944 
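Every `get_stat executed <opcode>` call in this trace reduces to the same jq filter: select one opcode from an `accel_get_stats`-style reply and print its `executed` count. A standalone illustration, with the reply mocked using values like those seen around `@90`-`@95`:

```shell
#!/usr/bin/env bash
# Shape of the chaining.sh@44 filter: pick one operation record by
# opcode and emit its executed counter. The reply JSON is mocked.
reply='{"sequence_executed": 13, "operations": [
  {"opcode": "encrypt", "executed": 2},
  {"opcode": "decrypt", "executed": 12},
  {"opcode": "copy",    "executed": 4}]}'
echo "$reply" | jq -r '.operations[] | select(.opcode == "decrypt").executed'
# prints: 12
```

`jq -r` strips the JSON quoting so the result drops straight into bash arithmetic in the `(( ... ))` delta checks.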
12:07:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
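The deltas asserted here (`31 == 15 + 16` sequences, `36 == 4 + 32` encrypts after the 16 x 4 KiB writes) are consistent with each block running one accel sequence and being encrypted twice, once per stacked crypto bdev; reading the factor of two as the crypto0/crypto1 pair created earlier is my inference, not something the log states. A back-of-envelope check:

```shell
#!/usr/bin/env bash
# Expected counter deltas for the 16 x 4 KiB write pass, assuming one
# sequence per block and one encrypt per crypto layer per block (the
# per-layer reading is an inference from the counters, not log output).
blocks=16
layers=2
echo "sequence delta: $(( blocks ))"           # 31 == 15 + 16
echo "encrypt delta:  $(( blocks * layers ))"  # 36 == 4 + 32
```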
00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@114 -- # update_stats 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:16.944 12:07:43 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:16.944 12:07:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:16.944 12:07:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:17.204 12:07:44 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:17.204 12:07:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:17.204 12:07:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:17.204 12:07:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:17.204 12:07:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:17.204 12:07:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:17.204 12:07:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:17.204 12:07:44 chaining -- 
bdev/chaining.sh@40 -- # [[ -z copy ]] 00:31:17.204 12:07:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:17.204 12:07:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:17.204 12:07:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:17.205 12:07:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:17.205 12:07:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:17.205 12:07:44 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:31:17.205 12:07:44 chaining -- bdev/chaining.sh@117 -- # : 00:31:17.205 12:07:44 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.SsT7LUrtHa --ib Nvme0n1 --bs 4096 --count 16 00:31:17.205 12:07:44 chaining -- bdev/chaining.sh@25 -- # local config 00:31:17.205 12:07:44 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:31:17.205 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:31:17.205 12:07:44 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:31:17.205 12:07:44 chaining -- bdev/chaining.sh@31 -- # config='{ 00:31:17.205 "subsystems": [ 00:31:17.205 { 00:31:17.205 "subsystem": "bdev", 00:31:17.205 "config": [ 00:31:17.205 { 00:31:17.205 "method": "bdev_nvme_attach_controller", 00:31:17.205 "params": { 00:31:17.205 "trtype": "tcp", 00:31:17.205 "adrfam": "IPv4", 00:31:17.205 "name": "Nvme0", 00:31:17.205 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:17.205 "traddr": "10.0.0.2", 00:31:17.205 "trsvcid": "4420" 00:31:17.205 } 00:31:17.205 }, 00:31:17.205 { 00:31:17.205 "method": "bdev_set_options", 00:31:17.205 "params": { 00:31:17.205 "bdev_auto_examine": false 00:31:17.205 } 00:31:17.205 } 00:31:17.205 ] 00:31:17.205 } 00:31:17.205 ] 00:31:17.205 }' 00:31:17.205 12:07:44 chaining 
-- bdev/chaining.sh@33 -- # echo '{ 00:31:17.205 "subsystems": [ 00:31:17.205 { 00:31:17.205 "subsystem": "bdev", 00:31:17.205 "config": [ 00:31:17.205 { 00:31:17.205 "method": "bdev_nvme_attach_controller", 00:31:17.205 "params": { 00:31:17.205 "trtype": "tcp", 00:31:17.205 "adrfam": "IPv4", 00:31:17.205 "name": "Nvme0", 00:31:17.205 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:31:17.205 "traddr": "10.0.0.2", 00:31:17.205 "trsvcid": "4420" 00:31:17.205 } 00:31:17.205 }, 00:31:17.205 { 00:31:17.205 "method": "bdev_set_options", 00:31:17.205 "params": { 00:31:17.205 "bdev_auto_examine": false 00:31:17.205 } 00:31:17.205 } 00:31:17.205 ] 00:31:17.205 } 00:31:17.205 ] 00:31:17.205 }' 00:31:17.205 12:07:44 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.SsT7LUrtHa --ib Nvme0n1 --bs 4096 --count 16 00:31:17.205 [2024-05-14 12:07:44.290641] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:31:17.205 [2024-05-14 12:07:44.290713] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1855391 ] 00:31:17.464 [2024-05-14 12:07:44.420629] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:17.464 [2024-05-14 12:07:44.526980] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:17.983  Copying: 64/64 [kB] (average 1306 kBps) 00:31:17.983 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@39 -- # opcode= 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:31:17.983 12:07:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:17.983 12:07:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:17.983 12:07:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:17.983 12:07:45 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:17.984 
12:07:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:17.984 12:07:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:17.984 12:07:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:17.984 12:07:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.hzrMB2HB0j /tmp/tmp.SsT7LUrtHa 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.hzrMB2HB0j /tmp/tmp.SsT7LUrtHa 00:31:18.242 12:07:45 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@117 -- # sync 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@120 -- # set +e 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:31:18.242 rmmod nvme_tcp 00:31:18.242 rmmod nvme_fabrics 00:31:18.242 rmmod nvme_keyring 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@124 -- # set -e 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@125 -- # return 0 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@489 -- # '[' -n 1854524 ']' 00:31:18.242 12:07:45 chaining -- nvmf/common.sh@490 -- # killprocess 1854524 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@946 -- # '[' -z 1854524 ']' 00:31:18.242 12:07:45 chaining -- 
common/autotest_common.sh@950 -- # kill -0 1854524 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@951 -- # uname 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1854524 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1854524' 00:31:18.242 killing process with pid 1854524 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@965 -- # kill 1854524 00:31:18.242 [2024-05-14 12:07:45.327482] app.c:1024:log_deprecation_hits: *WARNING*: decode_rpc_listen_address: deprecation '[listen_]address.transport is deprecated in favor of trtype' scheduled for removal in v24.09 hit 1 times 00:31:18.242 12:07:45 chaining -- common/autotest_common.sh@970 -- # wait 1854524 00:31:18.501 12:07:45 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:31:18.501 12:07:45 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:31:18.501 12:07:45 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:31:18.501 12:07:45 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:31:18.501 12:07:45 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:31:18.501 12:07:45 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:18.501 12:07:45 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:18.501 12:07:45 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:18.760 12:07:45 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:31:18.760 12:07:45 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:31:18.760 12:07:45 chaining -- bdev/chaining.sh@132 
-- # bperfpid=1855604 00:31:18.760 12:07:45 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1855604 00:31:18.760 12:07:45 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:31:18.760 12:07:45 chaining -- common/autotest_common.sh@827 -- # '[' -z 1855604 ']' 00:31:18.760 12:07:45 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:18.760 12:07:45 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:31:18.760 12:07:45 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:18.760 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:18.760 12:07:45 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:31:18.760 12:07:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:18.760 [2024-05-14 12:07:45.689765] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:31:18.760 [2024-05-14 12:07:45.689830] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1855604 ] 00:31:18.760 [2024-05-14 12:07:45.819980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:19.020 [2024-05-14 12:07:45.927273] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:19.588 12:07:46 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:31:19.588 12:07:46 chaining -- common/autotest_common.sh@860 -- # return 0 00:31:19.588 12:07:46 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:31:19.588 12:07:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:19.588 12:07:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:19.847 malloc0 00:31:19.847 true 00:31:19.847 true 00:31:19.847 [2024-05-14 12:07:46.765915] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:19.847 crypto0 00:31:19.847 [2024-05-14 12:07:46.773940] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:31:19.847 crypto1 00:31:19.847 12:07:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:19.847 12:07:46 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:19.847 Running I/O for 5 seconds... 
00:31:25.122 00:31:25.122 Latency(us) 00:31:25.122 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:25.122 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:31:25.122 Verification LBA range: start 0x0 length 0x2000 00:31:25.122 crypto1 : 5.01 11389.07 44.49 0.00 0.00 22415.87 6439.62 15500.69 00:31:25.122 =================================================================================================================== 00:31:25.122 Total : 11389.07 44.49 0.00 0.00 22415.87 6439.62 15500.69 00:31:25.122 0 00:31:25.122 12:07:51 chaining -- bdev/chaining.sh@146 -- # killprocess 1855604 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@946 -- # '[' -z 1855604 ']' 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@950 -- # kill -0 1855604 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@951 -- # uname 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1855604 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1855604' 00:31:25.122 killing process with pid 1855604 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@965 -- # kill 1855604 00:31:25.122 Received shutdown signal, test time was about 5.000000 seconds 00:31:25.122 00:31:25.122 Latency(us) 00:31:25.122 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:25.122 =================================================================================================================== 00:31:25.122 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:25.122 12:07:51 chaining -- common/autotest_common.sh@970 -- # wait 1855604 00:31:25.382 12:07:52 
chaining -- bdev/chaining.sh@152 -- # bperfpid=1856430 00:31:25.382 12:07:52 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1856430 00:31:25.382 12:07:52 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:31:25.382 12:07:52 chaining -- common/autotest_common.sh@827 -- # '[' -z 1856430 ']' 00:31:25.382 12:07:52 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:25.382 12:07:52 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:31:25.382 12:07:52 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:25.382 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:25.382 12:07:52 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:31:25.382 12:07:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:25.382 [2024-05-14 12:07:52.274235] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 
00:31:25.382 [2024-05-14 12:07:52.274304] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1856430 ] 00:31:25.382 [2024-05-14 12:07:52.401570] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:25.641 [2024-05-14 12:07:52.508445] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:26.209 12:07:53 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:31:26.209 12:07:53 chaining -- common/autotest_common.sh@860 -- # return 0 00:31:26.209 12:07:53 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:31:26.209 12:07:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:26.209 12:07:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:26.468 malloc0 00:31:26.468 true 00:31:26.468 true 00:31:26.468 [2024-05-14 12:07:53.343156] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:31:26.468 [2024-05-14 12:07:53.343203] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:26.468 [2024-05-14 12:07:53.343223] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16d4450 00:31:26.468 [2024-05-14 12:07:53.343236] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:26.468 [2024-05-14 12:07:53.344314] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:26.468 [2024-05-14 12:07:53.344341] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:31:26.468 pt0 00:31:26.468 [2024-05-14 12:07:53.351187] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:26.468 crypto0 00:31:26.468 [2024-05-14 12:07:53.359207] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:31:26.468 crypto1 00:31:26.468 12:07:53 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:26.468 12:07:53 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:26.468 Running I/O for 5 seconds... 00:31:31.796 00:31:31.796 Latency(us) 00:31:31.796 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:31.796 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:31:31.796 Verification LBA range: start 0x0 length 0x2000 00:31:31.796 crypto1 : 5.02 9034.85 35.29 0.00 0.00 28257.21 6667.58 17096.35 00:31:31.796 =================================================================================================================== 00:31:31.796 Total : 9034.85 35.29 0.00 0.00 28257.21 6667.58 17096.35 00:31:31.796 0 00:31:31.796 12:07:58 chaining -- bdev/chaining.sh@167 -- # killprocess 1856430 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@946 -- # '[' -z 1856430 ']' 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@950 -- # kill -0 1856430 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@951 -- # uname 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1856430 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1856430' 00:31:31.796 killing process with pid 1856430 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@965 -- # kill 1856430 00:31:31.796 Received shutdown signal, test time was about 5.000000 seconds 00:31:31.796 00:31:31.796 Latency(us) 00:31:31.796 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:31.796 
=================================================================================================================== 00:31:31.796 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@970 -- # wait 1856430 00:31:31.796 12:07:58 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:31:31.796 12:07:58 chaining -- bdev/chaining.sh@170 -- # killprocess 1856430 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@946 -- # '[' -z 1856430 ']' 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@950 -- # kill -0 1856430 00:31:31.796 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 950: kill: (1856430) - No such process 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@973 -- # echo 'Process with pid 1856430 is not found' 00:31:31.796 Process with pid 1856430 is not found 00:31:31.796 12:07:58 chaining -- bdev/chaining.sh@171 -- # wait 1856430 00:31:31.796 12:07:58 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:31.796 12:07:58 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:31:31.796 12:07:58 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@296 -- # e810=() 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@297 -- # x722=() 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@298 -- # mlx=() 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:31:31.796 12:07:58 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:31:31.796 12:07:58 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@336 -- # return 1 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:31:31.797 WARNING: No supported devices were found, fallback requested for tcp test 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:31:31.797 12:07:58 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:31:31.797 Cannot find device "nvmf_tgt_br" 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@155 -- # true 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:31:31.797 Cannot find device "nvmf_tgt_br2" 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@156 -- # true 00:31:31.797 12:07:58 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:31:32.055 Cannot find device "nvmf_tgt_br" 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@158 -- # true 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:31:32.055 Cannot find device "nvmf_tgt_br2" 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@159 -- # true 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:31:32.055 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@162 -- # true 00:31:32.055 12:07:58 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:31:32.055 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@163 -- # true 00:31:32.055 12:07:58 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:31:32.055 12:07:59 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:31:32.055 12:07:59 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:31:32.055 12:07:59 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:31:32.055 12:07:59 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:31:32.055 12:07:59 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:31:32.055 12:07:59 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:31:32.055 12:07:59 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:31:32.055 12:07:59 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:31:32.055 12:07:59 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:31:32.312 12:07:59 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:31:32.312 12:07:59 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:31:32.312 12:07:59 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:31:32.312 12:07:59 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:31:32.312 12:07:59 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:31:32.312 12:07:59 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:31:32.312 12:07:59 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2
00:31:32.312 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data.
00:31:32.312 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.140 ms
00:31:32.312
00:31:32.312 --- 10.0.0.2 ping statistics ---
00:31:32.312 1 packets transmitted, 1 received, 0% packet loss, time 0ms
00:31:32.312 rtt min/avg/max/mdev = 0.140/0.140/0.140/0.000 ms
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3
00:31:32.312 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data.
00:31:32.312 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.075 ms
00:31:32.312
00:31:32.312 --- 10.0.0.3 ping statistics ---
00:31:32.312 1 packets transmitted, 1 received, 0% packet loss, time 0ms
00:31:32.312 rtt min/avg/max/mdev = 0.075/0.075/0.075/0.000 ms
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1
00:31:32.312 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data.
00:31:32.312 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.045 ms
00:31:32.312
00:31:32.312 --- 10.0.0.1 ping statistics ---
00:31:32.312 1 packets transmitted, 1 received, 0% packet loss, time 0ms
00:31:32.312 rtt min/avg/max/mdev = 0.045/0.045/0.045/0.000 ms
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}")
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@433 -- # return 0
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']'
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp'
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]]
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]]
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o'
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']'
00:31:32.312 12:07:59 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp
00:31:32.569 12:07:59 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2
00:31:32.569 12:07:59 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt
00:31:32.569 12:07:59 chaining -- common/autotest_common.sh@720 -- # xtrace_disable
00:31:32.569 12:07:59 chaining -- common/autotest_common.sh@10 -- # set +x
00:31:32.569 12:07:59 chaining -- nvmf/common.sh@481 -- # nvmfpid=1857622
00:31:32.569 12:07:59 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2
00:31:32.569 12:07:59 chaining -- nvmf/common.sh@482 -- # waitforlisten 1857622
00:31:32.569 12:07:59 chaining -- common/autotest_common.sh@827 -- # '[' -z 1857622 ']'
00:31:32.569 12:07:59 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock
00:31:32.569 12:07:59 chaining -- common/autotest_common.sh@832 -- # local max_retries=100
00:31:32.569 12:07:59
chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:32.569 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:32.569 12:07:59 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:31:32.569 12:07:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:32.569 [2024-05-14 12:07:59.483844] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:31:32.569 [2024-05-14 12:07:59.483910] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:32.569 [2024-05-14 12:07:59.612703] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:32.827 [2024-05-14 12:07:59.717166] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:31:32.827 [2024-05-14 12:07:59.717217] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:31:32.827 [2024-05-14 12:07:59.717231] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:31:32.827 [2024-05-14 12:07:59.717244] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:31:32.827 [2024-05-14 12:07:59.717255] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:31:32.827 [2024-05-14 12:07:59.717299] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:31:33.393 12:08:00 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:31:33.393 12:08:00 chaining -- common/autotest_common.sh@860 -- # return 0 00:31:33.393 12:08:00 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:31:33.393 12:08:00 chaining -- common/autotest_common.sh@726 -- # xtrace_disable 00:31:33.393 12:08:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:33.393 12:08:00 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:31:33.393 12:08:00 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:31:33.393 12:08:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:33.393 12:08:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:33.393 malloc0 00:31:33.393 [2024-05-14 12:08:00.457701] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:31:33.393 [2024-05-14 12:08:00.473665] nvmf_rpc.c: 610:decode_rpc_listen_address: *WARNING*: decode_rpc_listen_address: deprecated feature [listen_]address.transport is deprecated in favor of trtype to be removed in v24.09 00:31:33.393 [2024-05-14 12:08:00.473928] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:31:33.652 12:08:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:33.652 12:08:00 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:31:33.652 12:08:00 chaining -- bdev/chaining.sh@189 -- # bperfpid=1857770 00:31:33.652 12:08:00 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:31:33.652 12:08:00 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1857770 /var/tmp/bperf.sock 00:31:33.652 12:08:00 chaining -- 
common/autotest_common.sh@827 -- # '[' -z 1857770 ']' 00:31:33.652 12:08:00 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/bperf.sock 00:31:33.652 12:08:00 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:31:33.652 12:08:00 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:31:33.652 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:31:33.652 12:08:00 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:31:33.652 12:08:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:33.652 [2024-05-14 12:08:00.545881] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization... 00:31:33.652 [2024-05-14 12:08:00.545944] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1857770 ] 00:31:33.652 [2024-05-14 12:08:00.668149] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:33.911 [2024-05-14 12:08:00.773033] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:34.479 12:08:01 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:31:34.479 12:08:01 chaining -- common/autotest_common.sh@860 -- # return 0 00:31:34.479 12:08:01 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:31:34.479 12:08:01 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:31:35.047 [2024-05-14 12:08:01.887658] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:35.047 nvme0n1 00:31:35.047 true 00:31:35.047 crypto0 00:31:35.047 12:08:01 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock 
perform_tests 00:31:35.047 Running I/O for 5 seconds...
00:31:40.328
00:31:40.328 Latency(us)
00:31:40.328 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:40.328 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096)
00:31:40.328 Verification LBA range: start 0x0 length 0x2000
00:31:40.328 crypto0 : 5.02 8255.00 32.25 0.00 0.00 30914.99 2920.63 32369.09
00:31:40.328 ===================================================================================================================
00:31:40.328 Total : 8255.00 32.25 0.00 0.00 30914.99 2920.63 32369.09
00:31:40.328 0
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@39 -- # opcode=
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@205 -- # sequence=82864
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@39 -- # event=executed
00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@39 -- #
opcode=encrypt 00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:40.328 12:08:07 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@206 -- # encrypt=41432 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:40.587 12:08:07 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@207 -- # decrypt=41432 00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:40.850 12:08:07 chaining -- 
bdev/chaining.sh@39 -- # opcode=crc32c
00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]]
00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed'
00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats
00:31:40.850 12:08:07 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:31:41.111 12:08:08 chaining -- bdev/chaining.sh@208 -- # crc32c=82864
00:31:41.111 12:08:08 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 ))
00:31:41.111 12:08:08 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence ))
00:31:41.111 12:08:08 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c ))
00:31:41.111 12:08:08 chaining -- bdev/chaining.sh@214 -- # killprocess 1857770
00:31:41.111 12:08:08 chaining -- common/autotest_common.sh@946 -- # '[' -z 1857770 ']'
00:31:41.111 12:08:08 chaining -- common/autotest_common.sh@950 -- # kill -0 1857770
00:31:41.111 12:08:08 chaining -- common/autotest_common.sh@951 -- # uname
00:31:41.111 12:08:08 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']'
00:31:41.111 12:08:08 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1857770
00:31:41.112 12:08:08 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0
00:31:41.112 12:08:08 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']'
00:31:41.112 12:08:08 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1857770'
00:31:41.112 killing process with pid 1857770
00:31:41.112 12:08:08 chaining -- common/autotest_common.sh@965 -- # kill 1857770
00:31:41.112 Received shutdown signal, test time was about 5.000000 seconds
00:31:41.112
00:31:41.112 Latency(us)
00:31:41.112 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:41.112 ===================================================================================================================
00:31:41.112 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:31:41.112 12:08:08 chaining -- common/autotest_common.sh@970 -- # wait 1857770
00:31:41.370 12:08:08 chaining -- bdev/chaining.sh@219 -- # bperfpid=1858717
00:31:41.370 12:08:08 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z
00:31:41.370 12:08:08 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1858717 /var/tmp/bperf.sock
00:31:41.370 12:08:08 chaining -- common/autotest_common.sh@827 -- # '[' -z 1858717 ']'
00:31:41.370 12:08:08 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/bperf.sock
00:31:41.370 12:08:08 chaining -- common/autotest_common.sh@832 -- # local max_retries=100
00:31:41.370 12:08:08 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...'
00:31:41.370 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...
00:31:41.370 12:08:08 chaining -- common/autotest_common.sh@836 -- # xtrace_disable
00:31:41.370 12:08:08 chaining -- common/autotest_common.sh@10 -- # set +x
00:31:41.370 [2024-05-14 12:08:08.400810] Starting SPDK v24.05-pre git sha1 b68ae4fb9 / DPDK 24.03.0 initialization...
00:31:41.370 [2024-05-14 12:08:08.400867] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1858717 ] 00:31:41.629 [2024-05-14 12:08:08.513172] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:41.629 [2024-05-14 12:08:08.611957] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:31:42.196 12:08:09 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:31:42.196 12:08:09 chaining -- common/autotest_common.sh@860 -- # return 0 00:31:42.196 12:08:09 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:31:42.196 12:08:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:31:42.455 [2024-05-14 12:08:09.507245] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:31:42.455 nvme0n1 00:31:42.455 true 00:31:42.455 crypto0 00:31:42.455 12:08:09 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:31:42.714 Running I/O for 5 seconds... 
00:31:47.990
00:31:47.990 Latency(us)
00:31:47.990 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:47.990 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:31:47.990 Verification LBA range: start 0x0 length 0x200
00:31:47.990 crypto0 : 5.01 1683.70 105.23 0.00 0.00 18627.11 1040.03 18919.96
00:31:47.990 ===================================================================================================================
00:31:47.990 Total : 1683.70 105.23 0.00 0.00 18627.11 1040.03 18919.96
00:31:47.990 0
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@39 -- # opcode=
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@233 -- # sequence=16858
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@39 -- # event=executed
00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:31:47.990 12:08:14 chaining --
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:31:47.990 12:08:14 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@234 -- # encrypt=8429 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:31:47.990 12:08:15 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@235 -- # decrypt=8429 00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:31:48.249 12:08:15 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]]
00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats
00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:31:48.249 12:08:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed'
00:31:48.508 12:08:15 chaining -- bdev/chaining.sh@236 -- # crc32c=16858
00:31:48.508 12:08:15 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 ))
00:31:48.508 12:08:15 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence ))
00:31:48.508 12:08:15 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c ))
00:31:48.508 12:08:15 chaining -- bdev/chaining.sh@242 -- # killprocess 1858717
00:31:48.508 12:08:15 chaining -- common/autotest_common.sh@946 -- # '[' -z 1858717 ']'
00:31:48.508 12:08:15 chaining -- common/autotest_common.sh@950 -- # kill -0 1858717
00:31:48.508 12:08:15 chaining -- common/autotest_common.sh@951 -- # uname
00:31:48.509 12:08:15 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']'
00:31:48.509 12:08:15 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1858717
00:31:48.509 12:08:15 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0
00:31:48.509 12:08:15 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']'
00:31:48.509 12:08:15 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1858717'
00:31:48.509 killing process with pid 1858717
00:31:48.509 12:08:15 chaining -- common/autotest_common.sh@965 -- # kill 1858717
00:31:48.509 Received shutdown signal, test time was about 5.000000 seconds
00:31:48.509
00:31:48.509 Latency(us)
00:31:48.509 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:48.509 ===================================================================================================================
00:31:48.509 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:31:48.509 12:08:15 chaining -- common/autotest_common.sh@970 -- # wait 1858717
00:31:48.768 12:08:15 chaining -- bdev/chaining.sh@243 -- # nvmftestfini
00:31:48.768 12:08:15 chaining -- nvmf/common.sh@488 -- # nvmfcleanup
00:31:48.768 12:08:15 chaining -- nvmf/common.sh@117 -- # sync
00:31:48.768 12:08:15 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']'
00:31:48.768 12:08:15 chaining -- nvmf/common.sh@120 -- # set +e
00:31:48.768 12:08:15 chaining -- nvmf/common.sh@121 -- # for i in {1..20}
00:31:48.768 12:08:15 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp
00:31:48.768 rmmod nvme_tcp
00:31:48.768 rmmod nvme_fabrics
00:31:48.768 rmmod nvme_keyring
00:31:48.768 12:08:15 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics
00:31:49.027 12:08:15 chaining -- nvmf/common.sh@124 -- # set -e
00:31:49.027 12:08:15 chaining -- nvmf/common.sh@125 -- # return 0
00:31:49.027 12:08:15 chaining -- nvmf/common.sh@489 -- # '[' -n 1857622 ']'
00:31:49.027 12:08:15 chaining -- nvmf/common.sh@490 -- # killprocess 1857622
00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@946 -- # '[' -z 1857622 ']'
00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@950 -- # kill -0 1857622
00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@951 -- # uname
00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']'
00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1857622
00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_1
00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']'
00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1857622'
00:31:49.027 killing process with pid
1857622 00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@965 -- # kill 1857622 00:31:49.027 [2024-05-14 12:08:15.913058] app.c:1024:log_deprecation_hits: *WARNING*: decode_rpc_listen_address: deprecation '[listen_]address.transport is deprecated in favor of trtype' scheduled for removal in v24.09 hit 1 times 00:31:49.027 12:08:15 chaining -- common/autotest_common.sh@970 -- # wait 1857622 00:31:49.287 12:08:16 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:31:49.287 12:08:16 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:31:49.287 12:08:16 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:31:49.287 12:08:16 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:31:49.287 12:08:16 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:31:49.287 12:08:16 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:49.287 12:08:16 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:49.287 12:08:16 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:49.287 12:08:16 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:31:49.287 12:08:16 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:31:49.287 00:31:49.287 real 0m45.898s 00:31:49.287 user 0m58.569s 00:31:49.287 sys 0m13.284s 00:31:49.287 12:08:16 chaining -- common/autotest_common.sh@1122 -- # xtrace_disable 00:31:49.287 12:08:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:31:49.287 ************************************ 00:31:49.287 END TEST chaining 00:31:49.287 ************************************ 00:31:49.287 12:08:16 -- spdk/autotest.sh@359 -- # [[ 0 -eq 1 ]] 00:31:49.287 12:08:16 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:31:49.287 12:08:16 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:31:49.287 12:08:16 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:31:49.287 12:08:16 -- spdk/autotest.sh@376 -- # trap - SIGINT SIGTERM EXIT 
00:31:49.287 12:08:16 -- spdk/autotest.sh@378 -- # timing_enter post_cleanup 00:31:49.287 12:08:16 -- common/autotest_common.sh@720 -- # xtrace_disable 00:31:49.287 12:08:16 -- common/autotest_common.sh@10 -- # set +x 00:31:49.287 12:08:16 -- spdk/autotest.sh@379 -- # autotest_cleanup 00:31:49.287 12:08:16 -- common/autotest_common.sh@1388 -- # local autotest_es=0 00:31:49.287 12:08:16 -- common/autotest_common.sh@1389 -- # xtrace_disable 00:31:49.287 12:08:16 -- common/autotest_common.sh@10 -- # set +x 00:31:54.609 INFO: APP EXITING 00:31:54.609 INFO: killing all VMs 00:31:54.609 INFO: killing vhost app 00:31:54.609 INFO: EXIT DONE 00:31:57.904 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:31:57.904 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:31:57.904 Waiting for block devices as requested 00:31:57.904 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:31:57.904 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:31:57.904 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:31:57.904 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:31:58.164 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:31:58.164 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:31:58.164 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:31:58.423 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:31:58.423 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:31:58.423 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:31:58.682 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:31:58.682 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:31:58.682 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:31:58.941 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:31:58.941 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:31:58.941 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:31:59.200 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:32:03.392 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:32:03.392 0000:85:05.5 (8086 
201d): Skipping not allowed VMD controller at 0000:85:05.5 00:32:03.392 Cleaning 00:32:03.392 Removing: /var/run/dpdk/spdk0/config 00:32:03.392 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:32:03.392 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:32:03.392 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:32:03.392 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:32:03.392 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:32:03.392 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:32:03.392 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:32:03.392 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:32:03.392 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:32:03.392 Removing: /var/run/dpdk/spdk0/hugepage_info 00:32:03.393 Removing: /dev/shm/nvmf_trace.0 00:32:03.393 Removing: /dev/shm/spdk_tgt_trace.pid1623567 00:32:03.393 Removing: /var/run/dpdk/spdk0 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1622717 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1623567 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1624098 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1624956 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1625184 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1626434 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1626461 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1626746 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1629362 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1630704 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1630930 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1631331 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1631596 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1631943 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1632178 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1632397 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1632615 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1633332 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1635946 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1636192 
00:32:03.393 Removing: /var/run/dpdk/spdk_pid1636498 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1636722 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1636750 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1636976 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1637175 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1637369 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1637640 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1637932 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1638125 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1638325 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1638522 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1638737 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1639043 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1639276 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1639468 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1639674 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1639872 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1640158 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1640425 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1640624 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1640826 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1641022 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1641260 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1641565 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1641792 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1642162 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1642391 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1642735 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1643105 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1643410 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1643675 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1644040 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1644243 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1644529 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1645000 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1645299 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1645406 00:32:03.393 Removing: 
/var/run/dpdk/spdk_pid1649545 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1651253 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1653451 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1654346 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1655426 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1655789 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1655810 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1655976 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1659795 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1660324 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1661241 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1661450 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1666781 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1671539 00:32:03.393 Removing: /var/run/dpdk/spdk_pid1676390 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1687421 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1697849 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1708833 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1721909 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1733976 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1746308 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1750303 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1753260 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1758264 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1760839 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1765959 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1769239 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1774854 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1777444 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1783916 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1785952 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1792591 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1794623 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1800728 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1802642 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1806946 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1807307 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1807657 
00:32:03.653 Removing: /var/run/dpdk/spdk_pid1808019 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1808451 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1809225 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1809906 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1810341 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1811951 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1813651 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1815698 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1817117 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1818720 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1820330 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1821932 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1823229 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1823811 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1824311 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1826313 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1828165 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1830013 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1831075 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1832311 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1832859 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1832880 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1833114 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1833319 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1833498 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1834575 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1836242 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1837741 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1838590 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1839852 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1840051 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1840077 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1840262 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1841083 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1841711 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1842109 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1844115 00:32:03.653 Removing: 
/var/run/dpdk/spdk_pid1845973 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1847808 00:32:03.653 Removing: /var/run/dpdk/spdk_pid1848870 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1850101 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1850649 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1850692 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1854741 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1854948 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1854988 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1855181 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1855391 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1855604 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1856430 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1857770 00:32:03.912 Removing: /var/run/dpdk/spdk_pid1858717 00:32:03.912 Clean 00:32:03.912 12:08:30 -- common/autotest_common.sh@1447 -- # return 0 00:32:03.912 12:08:30 -- spdk/autotest.sh@380 -- # timing_exit post_cleanup 00:32:03.912 12:08:30 -- common/autotest_common.sh@726 -- # xtrace_disable 00:32:03.912 12:08:30 -- common/autotest_common.sh@10 -- # set +x 00:32:03.912 12:08:30 -- spdk/autotest.sh@382 -- # timing_exit autotest 00:32:03.912 12:08:30 -- common/autotest_common.sh@726 -- # xtrace_disable 00:32:03.912 12:08:30 -- common/autotest_common.sh@10 -- # set +x 00:32:03.913 12:08:30 -- spdk/autotest.sh@383 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:32:04.172 12:08:31 -- spdk/autotest.sh@385 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:32:04.172 12:08:31 -- spdk/autotest.sh@385 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:32:04.172 12:08:31 -- spdk/autotest.sh@387 -- # hash lcov 00:32:04.172 12:08:31 -- spdk/autotest.sh@387 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:32:04.172 12:08:31 -- spdk/autotest.sh@389 -- # hostname 00:32:04.172 12:08:31 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc 
genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:32:04.172 geninfo: WARNING: invalid characters removed from testname! 00:32:30.744 12:08:54 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:31.680 12:08:58 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:34.216 12:09:01 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:36.815 12:09:03 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:39.351 12:09:06 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:41.887 12:09:08 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:32:44.423 12:09:11 -- spdk/autotest.sh@396 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:32:44.683 12:09:11 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:32:44.683 12:09:11 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:32:44.683 12:09:11 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:32:44.683 12:09:11 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:32:44.683 12:09:11 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:44.683 12:09:11 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:44.683 12:09:11 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:44.683 12:09:11 -- paths/export.sh@5 -- $ export PATH 00:32:44.683 12:09:11 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:44.683 12:09:11 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:44.683 12:09:11 -- common/autobuild_common.sh@437 -- $ date +%s 00:32:44.683 12:09:11 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1715681351.XXXXXX 00:32:44.683 12:09:11 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1715681351.g05SlZ 00:32:44.683 12:09:11 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:32:44.683 12:09:11 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']' 00:32:44.683 12:09:11 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:32:44.683 
12:09:11 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:32:44.683 12:09:11 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:32:44.683 12:09:11 -- common/autobuild_common.sh@453 -- $ get_config_params 00:32:44.683 12:09:11 -- common/autotest_common.sh@395 -- $ xtrace_disable 00:32:44.683 12:09:11 -- common/autotest_common.sh@10 -- $ set +x 00:32:44.683 12:09:11 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:32:44.683 12:09:11 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:32:44.683 12:09:11 -- pm/common@17 -- $ local monitor 00:32:44.683 12:09:11 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:44.683 12:09:11 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:44.683 12:09:11 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:44.683 12:09:11 -- pm/common@21 -- $ date +%s 00:32:44.683 12:09:11 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:44.683 12:09:11 -- pm/common@21 -- $ date +%s 00:32:44.683 12:09:11 -- pm/common@25 -- $ sleep 1 00:32:44.683 12:09:11 -- pm/common@21 -- $ date +%s 00:32:44.683 12:09:11 -- pm/common@21 -- $ date +%s 00:32:44.683 12:09:11 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715681351 00:32:44.683 12:09:11 -- pm/common@21 -- $ 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715681351 00:32:44.683 12:09:11 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715681351 00:32:44.683 12:09:11 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715681351 00:32:44.683 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715681351_collect-vmstat.pm.log 00:32:44.683 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715681351_collect-cpu-load.pm.log 00:32:44.683 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715681351_collect-cpu-temp.pm.log 00:32:44.683 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715681351_collect-bmc-pm.bmc.pm.log 00:32:45.621 12:09:12 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:32:45.621 12:09:12 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:32:45.621 12:09:12 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:45.621 12:09:12 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:32:45.621 12:09:12 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:32:45.621 12:09:12 -- spdk/autopackage.sh@19 -- $ timing_finish 00:32:45.621 12:09:12 -- common/autotest_common.sh@732 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:32:45.621 12:09:12 -- common/autotest_common.sh@733 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:32:45.621 12:09:12 -- common/autotest_common.sh@735 -- $ 
/usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:32:45.621 12:09:12 -- spdk/autopackage.sh@20 -- $ exit 0 00:32:45.621 12:09:12 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:32:45.621 12:09:12 -- pm/common@29 -- $ signal_monitor_resources TERM 00:32:45.621 12:09:12 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:32:45.621 12:09:12 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:45.621 12:09:12 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:32:45.621 12:09:12 -- pm/common@44 -- $ pid=1869674 00:32:45.621 12:09:12 -- pm/common@50 -- $ kill -TERM 1869674 00:32:45.621 12:09:12 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:45.621 12:09:12 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:32:45.621 12:09:12 -- pm/common@44 -- $ pid=1869676 00:32:45.621 12:09:12 -- pm/common@50 -- $ kill -TERM 1869676 00:32:45.621 12:09:12 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:45.621 12:09:12 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:32:45.621 12:09:12 -- pm/common@44 -- $ pid=1869677 00:32:45.621 12:09:12 -- pm/common@50 -- $ kill -TERM 1869677 00:32:45.621 12:09:12 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:32:45.621 12:09:12 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:32:45.621 12:09:12 -- pm/common@44 -- $ pid=1869708 00:32:45.621 12:09:12 -- pm/common@50 -- $ sudo -E kill -TERM 1869708 00:32:45.880 + [[ -n 1509495 ]] 00:32:45.880 + sudo kill 1509495 00:32:45.888 [Pipeline] } 00:32:45.902 [Pipeline] // stage 00:32:45.906 [Pipeline] } 00:32:45.919 [Pipeline] // timeout 
00:32:45.923 [Pipeline] } 00:32:45.936 [Pipeline] // catchError 00:32:45.961 [Pipeline] } 00:32:45.977 [Pipeline] // wrap 00:32:45.982 [Pipeline] } 00:32:45.996 [Pipeline] // catchError 00:32:46.003 [Pipeline] stage 00:32:46.005 [Pipeline] { (Epilogue) 00:32:46.020 [Pipeline] catchError 00:32:46.022 [Pipeline] { 00:32:46.035 [Pipeline] echo 00:32:46.036 Cleanup processes 00:32:46.041 [Pipeline] sh 00:32:46.321 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:46.321 1869789 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:32:46.321 1869995 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:46.336 [Pipeline] sh 00:32:46.616 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:46.617 ++ grep -v 'sudo pgrep' 00:32:46.617 ++ awk '{print $1}' 00:32:46.617 + sudo kill -9 1869789 00:32:46.627 [Pipeline] sh 00:32:46.912 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:32:56.911 [Pipeline] sh 00:32:57.198 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:32:57.198 Artifacts sizes are good 00:32:57.214 [Pipeline] archiveArtifacts 00:32:57.223 Archiving artifacts 00:32:57.374 [Pipeline] sh 00:32:57.658 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:32:57.676 [Pipeline] cleanWs 00:32:57.686 [WS-CLEANUP] Deleting project workspace... 00:32:57.686 [WS-CLEANUP] Deferred wipeout is used... 00:32:57.693 [WS-CLEANUP] done 00:32:57.695 [Pipeline] } 00:32:57.716 [Pipeline] // catchError 00:32:57.730 [Pipeline] sh 00:32:58.014 + logger -p user.info -t JENKINS-CI 00:32:58.025 [Pipeline] } 00:32:58.042 [Pipeline] // stage 00:32:58.048 [Pipeline] } 00:32:58.066 [Pipeline] // node 00:32:58.072 [Pipeline] End of Pipeline 00:32:58.116 Finished: SUCCESS